Neural Networks
Machine learning technique loosely inspired by the structure of the human brain, consisting of interconnected nodes that transform data using learned weights and activation functions.
Fundamental to modern AI, powering advancements in recognition and processing tasks.
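The node described above can be sketched as a single artificial neuron: a weighted sum of inputs plus a bias, passed through an activation function. This is a minimal illustration (the weights here are arbitrary, not learned):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, squashed by a sigmoid activation.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Example: two inputs, hand-picked weights (illustrative, not trained).
out = neuron([1.0, 0.5], [0.4, -0.2], 0.1)  # z = 0.4 - 0.1 + 0.1 = 0.4
```

In a real network, many such neurons are connected in layers and the weights are adjusted during training.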
Deep Neural Networks (DNNs)
Neural networks with multiple layers enabling hierarchical learning of features from raw data.
Revolutionized AI by achieving state-of-the-art performance in complex tasks.
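"Multiple layers" can be made concrete with a tiny two-layer forward pass: a hidden layer extracts intermediate features, and an output layer combines them. The weights below are hand-picked for illustration, not learned:

```python
import math

def relu(x):
    return max(0.0, x)

def layer(inputs, weights, biases, act):
    # One fully connected layer: each output neuron sees every input.
    return [act(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

def mlp(x):
    # Hidden layer learns intermediate features; output layer combines them.
    h = layer(x, [[0.5, -0.3], [0.8, 0.2]], [0.0, -0.1], relu)
    return layer(h, [[1.0, -1.0]], [0.2], lambda z: z)

y = mlp([1.0, 2.0])
```

Stacking more such layers is what makes the network "deep" and lets it build features hierarchically.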
Convolutional Neural Networks (CNNs)
Deep learning models designed for processing grid-like data such as images, using convolutional layers to extract hierarchical features.
Foundational to modern computer vision tasks and image recognition.
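The convolutional layer mentioned above boils down to sliding a small kernel over the input grid and taking dot products. A minimal sketch with a hand-crafted vertical-edge kernel (in real CNNs the kernel values are learned):

```python
def conv2d(image, kernel):
    # Slide the kernel over the image ("valid" padding) and take dot products.
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(ow)]
            for i in range(oh)]

# A 3x3 image with a bright left half; the kernel responds at the boundary.
img = [[1, 1, 0],
       [1, 1, 0],
       [1, 1, 0]]
edge = [[1, -1],
        [1, -1]]
out = conv2d(img, edge)
```

The output is large exactly where the left/right contrast sits, which is how convolution extracts local features like edges.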
Recurrent Neural Networks (RNNs)
Neural networks designed for sequential data, where a hidden state computed at each step is fed back in as an input to the next step.
Essential for tasks involving sequential or time-series data.
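The recurrence can be shown with a scalar-sized sketch: the hidden state h is updated from both the current input and the previous h, carrying context forward. Parameters are arbitrary illustrative values:

```python
import math

def rnn(inputs, w_x, w_h, b):
    # The hidden state h carries information from earlier steps forward.
    h = 0.0
    for x in inputs:
        h = math.tanh(w_x * x + w_h * h + b)
    return h

h = rnn([1.0, 0.5, -1.0], w_x=0.6, w_h=0.4, b=0.0)
```

Because each step reuses the same weights, the network handles sequences of any length; the trade-off is that gradients through many steps can vanish, which motivates LSTMs below.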
LSTM (Long Short-Term Memory)
Specialized RNN that uses gated memory cells to overcome the short-memory limitations of standard RNNs, effectively managing long-term dependencies in sequential data.
Addresses vanishing gradient problem, highly effective for sequence-based tasks.
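One LSTM step can be sketched with scalar states to show the gating structure: forget, input, and output gates control a separate cell state whose additive update eases gradient flow. All weights here are arbitrary placeholders:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, W):
    # Gates decide what to forget, what to write, and what to expose.
    f = sigmoid(W["f_x"] * x + W["f_h"] * h + W["f_b"])    # forget gate
    i = sigmoid(W["i_x"] * x + W["i_h"] * h + W["i_b"])    # input gate
    g = math.tanh(W["g_x"] * x + W["g_h"] * h + W["g_b"])  # candidate values
    o = sigmoid(W["o_x"] * x + W["o_h"] * h + W["o_b"])    # output gate
    c = f * c + i * g     # cell state: additive path eases gradient flow
    h = o * math.tanh(c)  # new hidden state
    return h, c

W = {k: 0.5 for k in ["f_x", "f_h", "f_b", "i_x", "i_h", "i_b",
                      "g_x", "g_h", "g_b", "o_x", "o_h", "o_b"]}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, W)
```

The key point is the cell-state update `c = f * c + i * g`: because it is additive rather than repeatedly squashed, gradients survive over long sequences.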
Transformers
Neural network architecture for sequence-to-sequence tasks, leveraging self-attention mechanisms to process all positions of the input in parallel rather than step by step.
Revolutionized AI by enabling scalable training and state-of-the-art results.
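Self-attention can be sketched in its simplest form, with the queries, keys, and values all equal to the input (a real Transformer applies learned projections first): every position is scored against every other, and the output is a weighted average of the values.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    # Every position attends to every other; weights come from scaled dot products.
    d = len(X[0])
    scores = [[sum(q * k for q, k in zip(qi, kj)) / math.sqrt(d) for kj in X]
              for qi in X]
    weights = [softmax(row) for row in scores]
    # Output: a weighted average of the value vectors (here V = X).
    return [[sum(w * v[j] for w, v in zip(row, X)) for j in range(d)]
            for row in weights]

out = self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

Because every pairwise score is independent of the others, the whole computation parallelizes, which is what makes training at scale tractable.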
Autoencoders
Neural networks designed to learn efficient, low-dimensional representations by encoding input into compressed form and reconstructing it.
Widely used for dimensionality reduction, anomaly detection, and data denoising.
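The encode-compress-reconstruct shape can be shown with hand-crafted maps standing in for a learned encoder and decoder (here the "code" simply averages adjacent pairs, so inputs with that redundancy reconstruct exactly):

```python
def encode(x):
    # Bottleneck: compress 4 values into 2 (here, average adjacent pairs).
    return [(x[0] + x[1]) / 2, (x[2] + x[3]) / 2]

def decode(z):
    # Expand the 2-value code back to 4 values.
    return [z[0], z[0], z[1], z[1]]

x = [3.0, 3.0, 7.0, 7.0]  # input with redundancy the code can exploit
z = encode(x)             # low-dimensional representation
x_hat = decode(z)         # reconstruction
```

A trained autoencoder learns such maps from data; reconstruction error then flags anomalies, since unusual inputs do not fit the redundancy the code learned to exploit.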
GANs (Generative Adversarial Networks)
Neural network architecture with two competing networks: a generator creating synthetic data and a discriminator evaluating authenticity.
Revolutionized content creation, enabling generation of highly realistic synthetic data.
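The adversarial setup can be sketched with a 1-D toy: "real" data clusters near a target value, a generator with one parameter maps noise to samples, and the generator is nudged in whichever direction the discriminator scores higher. This is a deliberately simplified stand-in (the discriminator is fixed and hand-written; in a real GAN both networks are trained against each other):

```python
import random

random.seed(0)

REAL_MEAN = 5.0  # "real" data clusters here (toy assumption)

def discriminator(x):
    # Scores how "real" a sample looks: higher when closer to REAL_MEAN.
    return 1.0 / (1.0 + abs(x - REAL_MEAN))

def generator(noise, shift):
    # Maps random noise to a synthetic sample; `shift` is its one parameter.
    return noise + shift

# Train the generator to fool the discriminator via simple hill climbing.
shift = 0.0
for _ in range(200):
    noise = random.uniform(-1, 1)
    up = discriminator(generator(noise, shift + 0.1))
    down = discriminator(generator(noise, shift - 0.1))
    shift += 0.1 if up > down else -0.1
```

The generator's parameter drifts toward the region the discriminator rates as real, which is the core dynamic: the generator improves precisely by exploiting the discriminator's judgments.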