# CLUT: Compressed Representation of 3DLUT
> Two attempts to compress 3DLUTs via learning: low-rank decomposition and hashing. **Higher performance with much smaller models!** ☺️

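As a toy illustration of the low-rank route (a minimal NumPy sketch under assumed shapes, not the method used in this repo), a D³×3 LUT can be unfolded into a matrix and truncated by SVD:

```python
import numpy as np

# Toy illustration of the low-rank idea (not this repo's exact method):
# unfold a D^3 x 3 LUT into a matrix and keep only `rank` SVD components.
d, rank = 17, 8
axes = np.linspace(0.0, 1.0, d)
lut = np.stack(np.meshgrid(axes, axes, axes, indexing="ij"), -1)  # identity 3DLUT, (d, d, d, 3)
mat = lut.reshape(d * d, d * 3)                # unfolded table, (289, 51)
u, s, vt = np.linalg.svd(mat, full_matrices=False)
approx = (u[:, :rank] * s[:rank]) @ vt[:rank]  # rank-limited reconstruction
err = np.linalg.norm(mat - approx) / np.linalg.norm(mat)
compressed = rank * (mat.shape[0] + mat.shape[1])  # parameters actually stored
full = mat.size                                    # parameters in the full table
```

A smooth LUT such as the identity is nearly low-rank, so a handful of components reconstructs it almost exactly while storing far fewer parameters, which is the intuition behind learning compressed LUT representations.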
### [**CLUT-Net: Learning Adaptively Compressed Representations of 3DLUTs for Lightweight Image Enhancement**](https://doi.org/10.1145/3503161.3547879)
- **Fengyi Zhang**, [Hui Zeng](https://huizeng.github.io/), [Tianjun Zhang](https://github.com/z619850002), [Lin Zhang](https://cslinzhang.gitee.io/home/)
- *ACMMM2022*

#### ![](doc/overview_mm.png)
Framework of our proposed CLUT-Net, which consists of:
- A neural network
- *N* basis CLUTs

The *N* basis CLUTs cover various enhancement effects required in different scenes.
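The basis-CLUT idea can be sketched in a few lines of NumPy (all names and shapes here are illustrative assumptions, not the repo's API): a network-predicted weight vector blends the *N* basis 3DLUTs into one image-adaptive LUT, which is then applied per pixel.

```python
import numpy as np

def combine_basis_cluts(basis_luts, weights):
    """Blend N basis LUTs into one image-adaptive LUT.

    basis_luts: (N, D, D, D, 3) array of N basis 3DLUTs
    weights:    (N,) image-adaptive weights (predicted by a network in practice)
    """
    return np.tensordot(weights, basis_luts, axes=1)  # (D, D, D, 3)

def apply_lut_nearest(lut, img):
    """Map each RGB pixel in [0, 1] through the LUT.

    Nearest-neighbour lookup for brevity; the real pipeline
    interpolates trilinearly between neighbouring grid entries.
    """
    d = lut.shape[0]
    idx = np.clip((img * (d - 1)).round().astype(int), 0, d - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Toy usage: two 4x4x4 basis LUTs, identity and inverted.
d = 4
grid = np.stack(np.meshgrid(*[np.linspace(0, 1, d)] * 3, indexing="ij"), -1)
basis = np.stack([grid, 1.0 - grid])   # (2, 4, 4, 4, 3)
w = np.array([0.7, 0.3])               # hypothetical network-predicted weights
lut = combine_basis_cluts(basis, w)
out = apply_lut_nearest(lut, np.random.rand(8, 8, 3))
```

Because only the small weight vector depends on the input image, the heavy LUT storage is shared across all images.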
### [**Adaptively Hashing 3DLUTs for Lightweight Real-time Image Enhancement**](/doc/23ICME_camera_ready_eXpress.pdf)
- **Fengyi Zhang**, [Lin Zhang](https://cslinzhang.gitee.io/home/), [Tianjun Zhang](https://github.com/z619850002), Dongqing Wang
- *ICME2023*
#### ![](doc/overview_icme.png)
Framework of our proposed HashLUT-based image enhancement network, which contains:
- *N* progressive basis HashLUTs
- A collision-compensation network
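The hashing idea can be sketched as follows (illustrative only; `PRIMES`, `hash_lookup`, and the table layout are assumptions modeled on standard spatial hashing, not this repo's code): integer LUT grid coordinates are hashed into a table far smaller than the full D³ grid, so colliding cells share one entry, which is the error the collision-compensation network is there to correct.

```python
import numpy as np

# Spatial-hash primes, as commonly used in fast hash encodings (illustrative).
PRIMES = (1, 2654435761, 805459861)

def hash_lookup(table, coords, table_size):
    """Look up LUT entries for integer grid coords of shape (..., 3).

    Several grid cells may hash to the same slot (a collision);
    a compensation network would correct the resulting error.
    """
    h = np.zeros(coords.shape[:-1], dtype=np.uint64)
    for i, p in enumerate(PRIMES):
        h ^= coords[..., i].astype(np.uint64) * np.uint64(p)
    return table[h % np.uint64(table_size)]

# Toy usage: a 33^3 LUT grid stored in a table with far fewer slots.
table_size = 1024                      # << 33**3 = 35937 full-grid entries
table = np.random.rand(table_size, 3)  # hashed RGB outputs (hypothetical init)
coords = np.indices((4, 4, 4)).transpose(1, 2, 3, 0).reshape(-1, 3)
vals = hash_lookup(table, coords, table_size)
```

The table size, not the grid resolution, now determines the parameter count, which is what makes the representation lightweight.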

![](doc/3D.png)
![](doc/3D_2.png)

- Grid occupancy visualization
![](doc/distribution_illu.png)

All the visualization code can be found in [utils/](./utils/).
The multi-resolution HashLUTs are implemented based on the fast hash encoding of

Great appreciation to the above work and all collaborators for their efforts!

And thanks for your interest!

Sincerely hope our work helps! 🌟
### BibTeX
    @inproceedings{clutnet,
      author = {Zhang, Fengyi and Zeng, Hui and Zhang, Tianjun and Zhang, Lin},
      title = {CLUT-Net: Learning Adaptively Compressed Representations of 3DLUTs for Lightweight Image Enhancement},
      booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
      year = {2022},
      pages = {6493--6501},
      numpages = {9},
      doi = {10.1145/3503161.3547879},
    }

---

    @inproceedings{hashlut,
      author = {Zhang, Fengyi and Zhang, Lin and Zhang, Tianjun and Wang, Dongqing},
      title = {Adaptively Hashing 3DLUTs for Lightweight Real-time Image Enhancement},
      booktitle = {2023 IEEE International Conference on Multimedia and Expo (ICME)},
      year = {2023},
      pages = {2771--2776},
      doi = {10.1109/ICME55011.2023.00471},
    }