Training the Shallow Neural Network

Training using the Levenberg-Marquardt Backpropagation Algorithm

In mathematics and computing, the Levenberg–Marquardt algorithm (LMA or just LM), also known as the damped least-squares (DLS) method, is used to solve non-linear least squares problems. These minimization problems arise especially in least-squares curve fitting [1]. The shallow neural network used here has a single hidden layer of 10 neurons.

Here's a step-by-step explanation of how the Levenberg-Marquardt backpropagation algorithm works (a code sketch follows the list):

1) Initialization: Initialize the weights and biases of the neural network with small random values.

2) Forward propagation: Feed a training input through the network and compute the corresponding output using the current weights and biases. This involves propagating the input forward through each layer of the network, applying activation functions and computing the output of each neuron.

3) Compute the error: Compare the computed output with the desired output...
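
To make the update concrete: each LM iteration solves (JᵀJ + λI) Δw = Jᵀe for the step Δw, where J is the Jacobian of the network outputs with respect to the weights, e is the residual vector, and λ is the damping factor. The following Python sketch applies this to a single-hidden-layer network on a toy curve-fitting task; the network shape, the finite-difference Jacobian, and all names are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def forward(params, x, hidden=10):
    """Run a 1-input, 1-output network with one hidden tanh layer."""
    W1 = params[:hidden].reshape(hidden, 1)        # input -> hidden weights
    b1 = params[hidden:2 * hidden]                 # hidden biases
    W2 = params[2 * hidden:3 * hidden]             # hidden -> output weights
    b2 = params[3 * hidden]                        # output bias
    h = np.tanh(W1 @ x[None, :] + b1[:, None])     # hidden activations
    return W2 @ h + b2                             # linear output layer

def jacobian(params, x, eps=1e-6):
    """Finite-difference Jacobian of the outputs w.r.t. the parameters."""
    base = forward(params, x)
    J = np.empty((x.size, params.size))
    for j in range(params.size):
        p = params.copy()
        p[j] += eps
        J[:, j] = (forward(p, x) - base) / eps
    return J

def train_lm(x, y, hidden=10, iters=100, lam=1e-2):
    rng = np.random.default_rng(0)
    params = rng.normal(scale=0.1, size=3 * hidden + 1)  # small random init
    for _ in range(iters):
        e = y - forward(params, x)                       # residuals
        J = jacobian(params, x)
        # Damped Gauss-Newton step: (J^T J + lam*I) dw = J^T e
        dw = np.linalg.solve(J.T @ J + lam * np.eye(params.size), J.T @ e)
        trial = params + dw
        if np.sum((y - forward(trial, x)) ** 2) < np.sum(e ** 2):
            params, lam = trial, lam * 0.5   # error decreased: accept, trust the model more
        else:
            lam *= 2.0                       # error increased: reject, increase damping
    return params

x = np.linspace(-1, 1, 50)
y = np.sin(3 * x)                            # toy curve-fitting target
params = train_lm(x, y)
print("final SSE:", np.sum((y - forward(params, x)) ** 2))
```

The damping schedule (halving λ on a successful step, doubling it on a failed one) is what lets LM interpolate between gradient descent (large λ) and Gauss-Newton (small λ).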

Feature Engineering and Dimensionality Reduction

Research

After the pre-recorded data in the dataset was examined, the group carried out research and a literature review to identify other factors that typically affect the discharge capacitance of supercapacitors, so that they could be incorporated into the dataset as additional features. Some of these engineered features are illustrated in the table below:

1) Voltage decrease + IR drop

Calculated using the following formula: V_IR = I × R_internal (discharge current times internal resistance).

When supercapacitors are rapidly charged and discharged, current flows through their internal resistance, which in turn causes a drop in the voltage known as IR drop [1]. This drop becomes more significant as the current increases and can lead to a decrease in the supercapacitor's voltage, known as voltage decrease. The voltage decrease can reduce the amount of energy that the supercapacitor can store and deliver. A code sketch for computing this feature follows this list.

2) Fit...
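
Since the IR drop is simply the product of the instantaneous discharge current and the internal (equivalent series) resistance, computing it as an engineered feature is straightforward. The Python sketch below assumes hypothetical column names and example values; the post does not show the actual dataset schema.

```python
import pandas as pd

# Illustrative rows; real values would come from the recorded dataset.
df = pd.DataFrame({
    "discharge_current_A": [0.5, 1.0, 2.0],  # hypothetical column names
    "esr_ohm": [0.05, 0.05, 0.05],           # internal (equivalent series) resistance
})

# IR drop grows linearly with discharge current: V_IR = I * R_internal
df["ir_drop_V"] = df["discharge_current_A"] * df["esr_ohm"]
print(df)
```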