Python AI/ML
Fee: ₹29,500 + GST ₹5,310, Duration: 5 months, Contact: patkar@rajeshpatkar.com
Course Content
1. Introduction to Deep Learning
Deep Learning Foundations
What is Deep Learning?
Differences between Machine Learning and Deep Learning
Applications of Deep Learning (vision, NLP, reinforcement learning, etc.)
Mathematics for Deep Learning
Linear Algebra: Vectors, matrices, and operations
Calculus: Derivatives, gradients, and optimization basics
Probability: Basic probability and distributions
Neural Networks Overview
Biological inspiration and artificial neurons
Basic architecture: Input, hidden, and output layers
Activation functions (ReLU, Sigmoid, Tanh, Softmax)
Introduction to TensorFlow and PyTorch
Installing and setting up frameworks
Basic TensorFlow operations
Introduction to PyTorch tensors
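A minimal sketch of the kind of first exercise this module builds toward: a few basic TensorFlow operations and PyTorch tensors, assuming both frameworks are installed as covered in the setup topic.

    import tensorflow as tf
    import torch

    # Basic TensorFlow operations: constants and element-wise math
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[5.0, 6.0], [7.0, 8.0]])
    print(tf.matmul(a, b))          # matrix multiplication
    print(tf.reduce_mean(a))        # mean of all elements

    # PyTorch tensors: creation, operations, and autograd
    x = torch.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
    y = (x ** 2).sum()              # a scalar function of x
    y.backward()                    # autograd computes dy/dx
    print(x.grad)                   # gradients: 2 * x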
2. Fundamentals of Neural Networks
Building Blocks
Forward propagation
Backpropagation and gradient descent
Loss functions (MSE, Cross-Entropy)
Optimizers (SGD, Adam, RMSprop)
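To make these building blocks concrete, here is a short NumPy sketch of one gradient descent step for a single linear layer with MSE loss; the data and variable names are illustrative, not taken from course material.

    import numpy as np

    # Toy data: 4 samples, 3 features
    X = np.random.randn(4, 3)
    y_true = np.random.randn(4, 1)

    w = np.zeros((3, 1))            # weights
    b = 0.0                         # bias
    lr = 0.1                        # learning rate

    # Forward propagation: linear prediction
    y_pred = X @ w + b

    # MSE loss
    loss = np.mean((y_pred - y_true) ** 2)

    # Backpropagation: gradients of the MSE w.r.t. w and b
    grad_y = 2 * (y_pred - y_true) / len(X)
    grad_w = X.T @ grad_y
    grad_b = grad_y.sum()

    # One SGD update
    w -= lr * grad_w
    b -= lr * grad_b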
Hands-On Neural Networks
Building a neural network from scratch using NumPy
Building the same neural network using TensorFlow/Keras and PyTorch
Training and Evaluation
Train, validation, and test splits
Overfitting and underfitting
Regularization techniques: L1/L2 regularization, dropout, and data augmentation
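A minimal Keras sketch of the training-and-evaluation ideas above: a validation split, dropout, and L2 regularization. Layer sizes, the synthetic data, and hyperparameters are illustrative assumptions.

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers, regularizers

    # Synthetic data standing in for a real dataset
    X = np.random.randn(1000, 20)
    y = (X.sum(axis=1) > 0).astype("float32")

    model = keras.Sequential([
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty
        layers.Dropout(0.3),                                     # dropout
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])

    # validation_split holds out 20% of the data for validation
    model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)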
3. Convolutional Neural Networks (CNNs) for Computer Vision
Understanding Convolutions
Convolution operations and kernels
Feature maps, pooling layers, and padding
CNN Architectures
Building basic CNNs
Key architectures: LeNet, AlexNet, VGG, ResNet
Advanced Topics in CNNs
Transfer learning: Using pre-trained models
Fine-tuning pre-trained models
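A hedged transfer-learning sketch in Keras with a pre-trained ResNet50 backbone; the choice of ResNet50, the head layers, and the 10-class output are illustrative assumptions.

    from tensorflow import keras
    from tensorflow.keras import layers

    # Pre-trained ResNet50 backbone without its classification head
    base = keras.applications.ResNet50(weights="imagenet", include_top=False,
                                       input_shape=(224, 224, 3))
    base.trainable = False          # freeze the backbone for feature extraction

    # New task-specific head
    model = keras.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(10, activation="softmax"),   # e.g. 10 target classes
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Fine-tuning: later unfreeze the backbone and retrain with a small learning rate
    # base.trainable = True
    # model.compile(optimizer=keras.optimizers.Adam(1e-5), ...)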
4. Recurrent Neural Networks (RNNs) for Sequential Data
Sequential Modeling Basics
RNNs: Architecture and working principle
Challenges: Vanishing gradients and long-term dependencies
Advanced RNNs
Long Short-Term Memory (LSTM) networks
Gated Recurrent Units (GRUs)
Applications of RNNs
Time series forecasting
Text generation
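A minimal PyTorch LSTM sketch for sequential data, e.g. one-step-ahead time series forecasting; the sequence length, feature count, and hidden size are illustrative.

    import torch
    import torch.nn as nn

    class LSTMForecaster(nn.Module):
        def __init__(self, n_features=1, hidden=32):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                  # x: (batch, seq_len, n_features)
            out, _ = self.lstm(x)              # out: (batch, seq_len, hidden)
            return self.head(out[:, -1, :])    # predict from the last time step

    model = LSTMForecaster()
    x = torch.randn(8, 20, 1)                  # batch of 8 sequences, length 20
    print(model(x).shape)                      # torch.Size([8, 1])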
5. Transformers and Attention Mechanisms
Attention Mechanisms
Understanding attention in sequential models
Self-attention and multi-head attention
Transformers Architecture
Introduction to the Transformer model
Positional encodings and scaling
Pretrained Models and Applications
BERT for text classification
GPT for text generation
Hands-On Projects:
Sentiment analysis using BERT
Fine-tuning GPT for custom text generation
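A short sketch of the sentiment-analysis project using the Hugging Face transformers library, assuming transformers is installed; the pipeline downloads a default BERT-family sentiment model automatically.

    from transformers import pipeline

    # Loads a BERT-family model fine-tuned for sentiment analysis
    classifier = pipeline("sentiment-analysis")
    print(classifier("The course projects were genuinely useful."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]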
6. Generative Models
Autoencoders
Encoder-decoder architecture
Applications: Dimensionality reduction and anomaly detection
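A compact Keras sketch of the encoder-decoder idea, with a 2-dimensional bottleneck used for dimensionality reduction; the layer sizes and synthetic data are illustrative assumptions.

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    X = np.random.rand(1000, 784).astype("float32")  # stand-in for flattened images

    inp = keras.Input(shape=(784,))
    z = layers.Dense(64, activation="relu")(inp)
    z = layers.Dense(2)(z)                            # 2-D bottleneck (the "code")
    out = layers.Dense(64, activation="relu")(z)
    out = layers.Dense(784, activation="sigmoid")(out)

    autoencoder = keras.Model(inp, out)
    encoder = keras.Model(inp, z)                     # encoder shares the trained layers
    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.fit(X, X, epochs=5, batch_size=64)    # target = input (reconstruction)

    codes = encoder.predict(X)                        # low-dimensional representation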
Variational Autoencoders (VAEs)
Understanding probabilistic models
Applications: Generative modeling
Generative Adversarial Networks (GANs)
GAN architecture: Generator and discriminator
Applications: Image generation, style transfer
Diffusion Models
Basics of Diffusion Models
Understanding the forward and reverse diffusion processes
Noise schedules and Markov chains
Training Diffusion Models
Applications of Diffusion Models
Image synthesis
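A hedged NumPy sketch of the forward diffusion process with a linear noise schedule, showing how a noised sample x_t can be drawn directly from x_0; the schedule values and step count are illustrative.

    import numpy as np

    T = 1000
    betas = np.linspace(1e-4, 0.02, T)          # linear noise schedule
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)              # cumulative product over time steps

    def q_sample(x0, t):
        # x_t ~ N(sqrt(alpha_bar_t) * x0, (1 - alpha_bar_t) * I)
        noise = np.random.standard_normal(x0.shape)
        return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

    x0 = np.random.rand(28, 28)                 # stand-in for an image
    x_t = q_sample(x0, t=500)                   # heavily noised version of x0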
7. Deployment and Model Optimization
Model Optimization
Quantization and pruning for lightweight models
Knowledge distillation
Deployment Pipelines
Model serving using TensorFlow Serving or ONNX Runtime
Deploying models as REST APIs with Flask/FastAPI
Real-Time Model Serving
Using open-source tools like TensorFlow Serving for scalable deployment
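A minimal FastAPI sketch for serving a model behind a REST endpoint; the predict function is a placeholder, and a real service would load a saved model at startup.

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class PredictRequest(BaseModel):
        features: list[float]

    # Placeholder for a real model loaded from disk, e.g. keras.models.load_model(...)
    def predict(features):
        return sum(features) / len(features)

    @app.post("/predict")
    def predict_endpoint(req: PredictRequest):
        return {"prediction": predict(req.features)}

    # Run with: uvicorn app:app --reload   (assuming this file is saved as app.py)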
8. Working with LLMs
Using APIs from popular LLM providers:
Fundamentals of prompt engineering:
Structuring effective prompts for specific tasks
Fine-tuning prompts for domain-specific applications
Fine-tuning LLMs for advanced use cases:
Overview of custom training using open-source tools
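A hedged sketch of calling a hosted LLM with a structured prompt through the OpenAI Python SDK; the model name is illustrative, provider SDK details may change, and the API key is read from the environment.

    from openai import OpenAI

    client = OpenAI()   # reads OPENAI_API_KEY from the environment

    # Prompt structure: a system message fixes the role, the user message states the task
    response = client.chat.completions.create(
        model="gpt-4o-mini",                      # illustrative model name
        messages=[
            {"role": "system", "content": "You are a concise technical assistant."},
            {"role": "user", "content": "Summarize backpropagation in two sentences."},
        ],
    )
    print(response.choices[0].message.content)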
9. AI as a Service (AIaaS)
10. Building AI-Powered Web Apps
Integrating AI models into Flask or FastAPI backends
Consuming AI services with:
RESTful APIs
gRPC for efficient communication
11. Real-Time AI Systems
Using WebSockets for real-time AI integrations
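A minimal FastAPI WebSocket sketch for real-time interaction with a model; the echo-style reply stands in for a real inference call.

    from fastapi import FastAPI, WebSocket, WebSocketDisconnect

    app = FastAPI()

    @app.websocket("/ws")
    async def ws_endpoint(websocket: WebSocket):
        await websocket.accept()
        try:
            while True:
                text = await websocket.receive_text()    # message from the client
                reply = f"model output for: {text}"      # stand-in for real inference
                await websocket.send_text(reply)
        except WebSocketDisconnect:
            pass                                         # client closed the connection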