| Title: | A lightweight deep learning model for profiled SCA based on random convolution kernels |
|---|
| Authors: | Ou, Yu (Author), Wei, Yongzhuang (Author), Rodríguez, René (Author), Zhang, Fengrong (Author) |
| Files: | RAZ_Ou_Yu_2025.pdf (1.75 MB), MD5: 7CD2975FED8077F681AC93A093FB01FA; https://www.mdpi.com/2078-2489/16/5/351 |
|---|
| Language: | English |
|---|
| Work type: | Article |
|---|
| Typology: | 1.01 - Original Scientific Article |
|---|
| Organization: | FAMNIT - Faculty of Mathematics, Science and Information Technologies |
|---|
| Abstract: | In deep learning-based side-channel analysis (DL-SCA), the number of model parameters can proliferate as the number of power points per trace increases, especially for raw power traces. Designing a lightweight deep learning model for profiled SCA that handles traces with many power points while requiring fewer parameters and less training time remains a challenge. In this article, a DL-SCA model is proposed that introduces a non-trained deep learning technique called random convolutional kernels, which extracts leakage features in a manner similar to a transformer model. The extracted features are then processed by a classifier with an attention mechanism, which outputs a probability vector over the candidate keys. Moreover, we analyze the performance and complexity of the random kernels and discuss how they work in theory. On several public AES datasets, the experimental results show that the numbers of required profiling traces and trainable parameters are reduced by over 70% and 94%, respectively, compared with state-of-the-art works, while the number of power traces required to recover the real key remains acceptable. Importantly, unlike previous SCA models, our architecture eliminates the dependency between the feature length of the power traces and the number of trainable parameters, which allows it to be applied to raw power traces. |
|---|
| Keywords: | side-channel analysis, deep learning, convolutional neural networks, random convolution kernels |
|---|
| Publication date: | 27.04.2025 |
|---|
| Year of publishing: | 2025 |
|---|
| Number of pages: | pp. 1-20 |
|---|
| Numbering: | Vol. 16, iss. 5, [article no.] 351 |
|---|
| PID: | 20.500.12556/RUP-21799  |
|---|
| UDC: | 004.7:004.8 |
|---|
| ISSN on article: | 2078-2489 |
|---|
| DOI: | 10.3390/info16050351  |
|---|
| COBISS.SI-ID: | 234521859  |
|---|
| Publication date in RUP: | 26.09.2025 |
|---|
| Views: | 1514 |
|---|
| Downloads: | 7 |
|---|
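
The abstract describes a two-stage architecture: fixed, randomly initialised convolution kernels extract leakage features from a power trace, and a small attention-based classifier maps those features to a probability vector over candidate key bytes. The PyTorch sketch below only illustrates that general idea; the kernel count, kernel size, global max pooling, attention head, and layer sizes are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): random, non-trained convolution
# kernels as a feature extractor, followed by a small attention-based classifier
# over the 256 candidate key bytes. All sizes are assumed for demonstration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class RandomKernelFeatures(nn.Module):
    """Non-trained 1D convolution kernels: weights stay at their random
    initialisation, so they contribute no trainable parameters."""

    def __init__(self, n_kernels: int = 256, kernel_size: int = 9):
        super().__init__()
        self.conv = nn.Conv1d(1, n_kernels, kernel_size, padding=kernel_size // 2)
        for p in self.conv.parameters():          # freeze: random but fixed
            p.requires_grad = False

    def forward(self, trace: torch.Tensor) -> torch.Tensor:
        # trace: (batch, trace_length) -> features: (batch, n_kernels)
        x = self.conv(trace.unsqueeze(1))
        # Global max pooling removes the dependency on trace length, so the
        # classifier size does not grow with the number of power points.
        return F.adaptive_max_pool1d(x, 1).squeeze(-1)


class AttentionClassifier(nn.Module):
    """Small trainable head: feature-wise attention weights followed by a
    linear layer producing scores for the 256 key-byte candidates."""

    def __init__(self, n_features: int = 256, n_classes: int = 256):
        super().__init__()
        self.attn = nn.Linear(n_features, n_features)
        self.out = nn.Linear(n_features, n_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.attn(feats), dim=-1)  # attention over features
        return self.out(feats * weights)                   # logits for candidate keys


if __name__ == "__main__":
    traces = torch.randn(8, 10_000)            # 8 synthetic traces, 10k power points
    model = nn.Sequential(RandomKernelFeatures(), AttentionClassifier())
    key_probs = torch.softmax(model(traces), dim=-1)
    print(key_probs.shape)                     # torch.Size([8, 256])
```

Because only the attention head and output layer are trainable, the parameter count is independent of the trace length, which is the property the abstract highlights for handling raw power traces.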