1. A lightweight deep learning model for profiled SCA based on random convolution kernels
Yu Ou, Yongzhuang Wei, René Rodríguez, Fengrong Zhang, 2025, original scientific article
Abstract: In deep learning-based side-channel analysis (DL-SCA), model parameters may proliferate as the number of power points per trace increases, especially for raw power traces. Designing a lightweight deep learning model for profiled SCA that can handle traces with many power points while keeping the parameter count and time cost low remains a challenge. In this article, a DL-SCA model is proposed that introduces a non-trained DL technique called random convolutional kernels, which extracts leakage features in a manner similar to a transformer model. The extracted features are then processed by a classifier with an attention mechanism, which outputs a probability vector over the candidate keys. Moreover, we analyze the performance and complexity of the random kernels and discuss how they work in theory. On several public AES datasets, the experimental results show that the numbers of required profiling traces and trainable parameters are reduced by over 70% and 94%, respectively, compared with state-of-the-art works, while the number of power traces required to recover the real key remains acceptable. Importantly, unlike previous SCA models, our architecture eliminates the dependency between the feature length of power traces and the number of trainable parameters, which allows the architecture to be applied to raw power traces.
Keywords: side-channel analysis, deep learning, convolution neural networks, random convolution kernel
Published in RUP: 26.09.2025
Full text (1,75 MB)
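The non-trained random-kernel idea summarized in the abstract can be sketched as follows. This is an illustrative approximation in the spirit of ROCKET-style random convolutional kernels, not the article's exact architecture: the kernel lengths, dilation range, pooling choices (max value and proportion of positive values), and all function names below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_kernels(num_kernels):
    """Draw random, non-trained 1D kernels: random length, weights, bias, dilation."""
    kernels = []
    for _ in range(num_kernels):
        length = int(rng.choice([3, 5, 7, 9]))
        w = rng.normal(size=length)
        w -= w.mean()                          # zero-mean weights
        bias = rng.uniform(-1, 1)
        dilation = int(2 ** rng.uniform(0, 4)) # dilations between 1 and 16
        kernels.append((w, bias, dilation))
    return kernels

def transform(trace, kernels):
    """Map one power trace to a fixed-length feature vector (2 features per kernel)."""
    feats = []
    for w, bias, dilation in kernels:
        span = dilation * (len(w) - 1)
        if span >= len(trace):                 # kernel does not fit: emit zeros
            feats += [0.0, 0.0]
            continue
        # dilated 1D convolution via an index matrix (no learned parameters)
        idx = np.arange(len(trace) - span)[:, None] + dilation * np.arange(len(w))
        out = trace[idx] @ w + bias
        feats += [out.max(), float((out > 0).mean())]  # max pooling and PPV
    return np.array(feats)

trace = rng.normal(size=700)                   # stand-in for a measured power trace
kernels = random_kernels(100)
features = transform(trace, kernels)
print(features.shape)                          # (200,)
```

Because the kernels are drawn once and never trained, the feature-vector length (and hence the parameter count of any downstream classifier) is fixed regardless of how many power points the trace contains, which mirrors the abstract's claim of decoupling trace length from trainable parameters.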
2. The algebraic characterization of M-subspaces of bent concatenations and its application
Sadmir Kudin, Enes Pašalić, Alexandr Polujan, Fengrong Zhang, 2025, original scientific article
Abstract: Every Boolean bent function f can be written either as a concatenation f = f1||f2 of two complementary semi-bent functions f1, f2, or as a concatenation f = f1||f2||f3||f4 of four Boolean functions f1, f2, f3, f4, all of which are simultaneously bent, semi-bent, or 5-valued spectra functions. In this context, it is essential to specify conditions on these bent concatenations so that f does (not) belong to the completed Maiorana-McFarland class M#. In this article, we resolve this question completely by providing the algebraic characterization of M-subspaces for concatenations of the form f = f1||f2 and f = f1||f2||f3||f4, which allows us to estimate ind(f), the linearity index of f, and consequently to establish necessary and sufficient conditions for f to lie outside M#. Based on these conditions, we propose several explicit and generic design methods for specifying bent functions outside M# in the special case f = g||h||g||(h+1), where g and h are bent functions. Moreover, we show that the linearity index of f = g||h||g||(h+1) can even be decreased, compared to ind(g) and ind(h), if the largest dimension of a common M-subspace of g and h is small enough (less than min{ind(g), ind(h)} − 1). This also induces iterative methods of constructing bent functions outside M# with (controllably) low linearity index. Finally, we derive a lower bound on the 2-rank of f and show that this concatenation method can generate bent functions that are provably outside M# ∪ PS#ap. In contrast to the approach of Weng et al. (2007), which uses the direct sum and a bent function g outside M#, our method employs g, h ∈ M# for the same purpose.
Keywords: bent function, Maiorana-McFarland class, M-subspaces
Published in RUP: 04.08.2025
Full text (349,74 KB)
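The concatenation construction f = g||h||g||(h+1) from the abstract can be checked directly on a small example. The following sketch (not from the article) verifies bentness via the Walsh-Hadamard spectrum, taking as g and h two standard Maiorana-McFarland bent functions on n = 4 variables; a Boolean function on m variables is bent iff every Walsh coefficient has absolute value 2^{m/2}.

```python
import numpy as np

def wht(tt):
    """Walsh-Hadamard transform of a Boolean truth table (0/1 entries)."""
    v = np.where(np.asarray(tt) == 0, 1, -1).astype(int)  # sign representation
    h = 1
    while h < len(v):
        for i in range(0, len(v), 2 * h):
            a = v[i:i + h].copy()
            b = v[i + h:i + 2 * h].copy()
            v[i:i + h] = a + b
            v[i + h:i + 2 * h] = a - b
        h *= 2
    return v

# Two Maiorana-McFarland bent functions on n = 4 variables
x = np.arange(16)
x0, x1, x2, x3 = (x >> 0) & 1, (x >> 1) & 1, (x >> 2) & 1, (x >> 3) & 1
g = (x0 * x1) ^ (x2 * x3)
h = (x0 * x2) ^ (x1 * x3)

# Concatenation f = g || h || g || (h + 1), a function on n + 2 = 6 variables
f = np.concatenate([g, h, g, h ^ 1])

# f is bent iff every Walsh coefficient has absolute value 2^{(n+2)/2} = 8
W = wht(f)
print(np.all(np.abs(W) == 8))   # True
```

The check succeeds because, for this concatenation, each Walsh coefficient of f equals 2·W_g(a) or ±2·W_h(a), so |W_f| = 2·2^{n/2} = 2^{(n+2)/2} whenever g and h are bent; whether f additionally falls outside M# is exactly the question the article's M-subspace characterization answers.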