Portfolio item number 1
Short description of portfolio item number 1
Short description of portfolio item number 2 
Published in ICCV, 2023
Contrastive learning framework for robust deepfake detection and generalization to unseen synthetic media.
Recommended citation: Nicolas Larue, Ngoc-Son Vu, Vitomir Štruc, Peter Peer, Vassilis Christophides. SeeABLE: Soft Discrepancies and Bounded Contrastive Learning for Exposing Deepfakes. ICCV 2023. https://openaccess.thecvf.com/content/ICCV2023/html/Larue_SeeABLE_Soft_Discrepancies_and_Bounded_Contrastive_Learning_for_Exposing_Deepfakes_ICCV_2023_paper.html
Published in CVPR, 2024
Efficient neural architecture for sparse-view CT reconstruction, inspired by Quasi-Newton optimization.
Recommended citation: Ishak Ayad, Nicolas Larue, Maï K. Nguyen. QN-Mixer: A Quasi-Newton MLP-Mixer Model for Sparse-View CT Reconstruction. CVPR 2024. https://towzeur.github.io/QN-Mixer/
Published in BMVC, 2024
Contrastive continual learning method inspired by neural collapse.
Recommended citation: Nicolas Larue, et al. SSD: Neural Collapse Inspired Contrastive Continual Learning. BMVC 2024. https://bmva-archive.org.uk/bmvc/2024/papers/Paper_579/paper.pdf
Published:
This is a description of your talk; it is a markdown file that can be formatted with markdown like any other post. Yay markdown!
Published:
This is a description of your conference proceedings talk; note the different value in the `type` field. You can put anything in this field.
Teaching Assistant, CY Cergy Paris University / ENSEA, 2021
I taught and supervised 251 hours of engineering-level courses at ENSEA / CY Cergy Paris University, within the French Grande Ecole engineering system. ENSEA is a selective graduate engineering school specializing in electronics, computer science, signal processing, embedded systems, and artificial intelligence. Its engineering curriculum is master’s-equivalent and trains students for technical roles across software, AI, hardware, telecommunications, and systems engineering.
My teaching covered machine learning, deep learning, software engineering, algorithms, systems, and hardware-oriented signal processing. Most of my work was in practical sessions, labs, and projects, where students had to implement, debug, and reason about technical systems rather than only learn theory.
I helped students bridge the gap between mathematical concepts, code, and working systems: from algorithmic reasoning and software design to deep learning experiments, systems programming, and FPGA/DSP-oriented signal processing. This experience strengthened my ability to explain complex technical ideas clearly, diagnose implementation issues quickly, and communicate across different levels of abstraction.
| Academic year | Course | Type | Hours |
|---|---|---|---|
| 2024-2025 | Deep Learning | Lecture + Lab | 23 |
| 2024-2025 | Hardware for Signal Processing | Lecture + Lab | 38 |
| 2023-2024 | Hardware for Signal Processing | Lecture + Lab | 38 |
| 2023-2024 | Systems and Networks | Lab | 16 |
| 2023-2024 | Transversal Software Project - Algorithms | Lab | 8 |
| 2022-2023 | Algorithms | Lab | 24 |
| 2022-2023 | Systems-on-Chip | Lab | 24 |
| 2022-2023 | Engineering Project | Lab | 16 |
| 2021-2022 | Software Engineering | Lab | 32 |
| 2021-2022 | Algorithms | Lab | 24 |
| 2021-2022 | Artificial Intelligence and Big Data | Lab | 8 |
Across these courses, I worked with engineering students on practical AI, software, and systems problems, ranging from low-level hardware constraints to high-level machine learning concepts.