Django
| Created | |
|---|---|
| Type | Web |
| Language | Python |
| Last Edit | |
Basics
Installation
Install Python and Create Virtual Environment
pip install virtualenv
virtualenv venv
If we want a particular version of Python, let's say 3.6, then we can use
python3.6 -m venv "my_env_name"
source myvenv/bin/activate
(myvenv) python3 -m pip install pip --upgrade
(myvenv) deactivate
Install Packages
django
djangorestframework
pyyaml
requests
django-cors-headers
psycopg2 # required if postgres is used
(myvenv) pip install -r requirements.txt
Setup
(myvenv) django-admin startproject django_course_project .
Virtual environment and project can have the same name.
In VS Code, open the Command Palette (Ctrl+Shift+P), then select Python: Select Interpreter. From the list, select the virtual environment in your project folder that starts with .env.
Setup On VPS
Step 1: Update Packages
sudo apt-get update
sudo apt-get upgrade
Step 2: Install, Create & Activate Virtualenv
sudo apt-get install python3-virtualenv
sudo virtualenv /opt/myenv
source /opt/myenv/bin/activate
Step 3: Install Django, Gunicorn & PostgreSQL Binary
pip install django gunicorn psycopg2-binary
Step 4: Install Postgres (Refer:
PostgreSQL)
Step 5: Create And Add SSL Key for Git
To clone a Git repository from a remote server (e.g., GitHub, GitLab, Bitbucket) to your VPS using SSH, you can follow these steps:
- Generate SSH Key on Your VPS:
If you haven't already, generate an SSH key pair on your VPS:
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
Follow the prompts to save the key in the default location (~/.ssh/id_rsa) or a location of your choice.
- Copy the Public Key:
Copy the public key to your clipboard:
cat ~/.ssh/id_rsa.pub
Copy the output; you will use it in the next step.
- Add SSH Key to Your Remote Git Repository:
- For GitHub: Go to your GitHub account settings > SSH and GPG keys > New SSH key, and paste the copied key.
- For GitLab: Go to your GitLab account settings > SSH Keys, and add a new SSH key with the copied key.
- For Bitbucket: Go to your Bitbucket account settings > SSH keys, and add the copied key.
- Clone the Repository on Your VPS:
git clone git@github.com:your_username/your_repository.git
Replace git@github.com:your_username/your_repository.git with the SSH URL of your Git repository.
- Provide Your SSH Key Passphrase:
If you set a passphrase when generating your SSH key, you will be prompted to enter it when you try to clone the repository. Enter the passphrase, and the cloning process should proceed.
OR
Step 5: Copy Local Project Folder To Server with SCP
Run on local machine
scp -r /path/to/local/directory username@your_vps_ip:/path/on/vps/
Step 6: Create settings.py
- Add Media and Static folder paths
from pathlib import Path
import os

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent

MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATIC_URL = '/static/'
- Edit Allowed Hosts Entry
ALLOWED_HOSTS = ['your_server_domain_or_IP', 'second_domain_or_IP', . . ., 'localhost']
- Add DB Config
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'myproject',
        'USER': 'myprojectuser',
        'PASSWORD': 'password',
        'HOST': 'localhost',
        'PORT': '',
    }
}
Step 7: Install Packages on VPS
pip install -r requirements.txt
Step 7: Migrate DB Changes
python manage.py makemigrations
python manage.py migrate
Sync DB (if pgadmin is not already installed; note: syncdb was removed in Django 1.9, so on modern Django use migrate instead)
python manage.py syncdb
Step 8: Create Django Dashboard Admin User
python manage.py createsuperuser
Step 9: Collect all static files to the folder you configured
python manage.py collectstatic
Step 10: Run & Test
Allow exception for port 8000
sudo ufw allow 8000
Run server
python manage.py runserver 0.0.0.0:8000
Visit <yourIPorDomain>:8000/admin
Step 11: Test Gunicorn
gunicorn --bind 0.0.0.0:8000 app.wsgi
Step 12: Creating systemd Socket and Service Files for Gunicorn
Create systemd socket file for Gunicorn
sudo nano /etc/systemd/system/gunicorn.socket
File:
[Unit]
Description=gunicorn socket
[Socket]
ListenStream=/run/gunicorn.sock
[Install]
WantedBy=sockets.target
Create systemd service file for Gunicorn
sudo nano /etc/systemd/system/gunicorn.service
File:
[Unit]
Description=gunicorn daemon
Requires=gunicorn.socket
After=network.target
[Service]
User=username
Group=usergroup
WorkingDirectory=/home/sammy/myprojectdir
ExecStart=/home/sammy/myprojectdir/myprojectenv/bin/gunicorn \
--access-logfile - \
--workers 3 \
--bind unix:/run/gunicorn.sock \
myproject.wsgi:application
[Install]
WantedBy=multi-user.target
Example
[Unit]
Description=gunicorn daemon for school_backend
After=network.target
[Service]
User=school-host
Group=school-host
WorkingDirectory=/home/school-django-app/
ExecStart=/home/school-env/bin/gunicorn app.wsgi:application --bind 172.105.53.40:8000
[Install]
WantedBy=multi-user.target
Step 13: Start, Enable & Check Status of Gunicorn
sudo systemctl start gunicorn.socket
sudo systemctl enable gunicorn.socket
Check Status
sudo systemctl status gunicorn.socket
Check logs if any error is shown in status
sudo journalctl -u gunicorn.socket
Check for the existence of the gunicorn.sock file within the /run directory:
file /run/gunicorn.sock
Step 14: Testing Socket Activation
Currently, if you’ve only started the gunicorn.socket unit, the gunicorn.service will not be active yet since the socket has not yet received any connections. You can check this by typing:
sudo systemctl status gunicorn
sudo journalctl -u gunicorn
Output
○ gunicorn.service - gunicorn daemon
Loaded: loaded (/etc/systemd/system/gunicorn.service; disabled; vendor preset: enabled)
Active: inactive (dead)
TriggeredBy: ● gunicorn.socket
To test the socket activation mechanism, you can send a connection to the socket through curl by typing:
curl --unix-socket /run/gunicorn.sock localhost
Step 15: Configure NGINX
NGINX is an incredibly fast and light-weight web server. We will use it to serve up our static files for our Django app.
sudo apt-get install nginx
Verify if NGINX is installed. Nginx is a web server that can also be used as a reverse proxy, load balancer, mail proxy and HTTP cache.
nginx -v
Configure Nginx to Proxy Pass to Gunicorn Process
Start by creating and opening a new server block in Nginx’s sites-available directory:
sudo nano /etc/nginx/sites-available/myproject
Inside, open up a new server block. You will start by specifying that this block should listen on the normal port 80 and that it should respond to your server's domain name or IP address:
Sample File
server {
    listen 80;
    server_name your_domain.com;  # Change this to your actual domain name or IP address

    location = /favicon.ico { access_log off; log_not_found off; }

    location /static/ {
        root /path/to/your/django/project;
    }

    location /media/ {
        root /path/to/your/django/project;
    }

    location / {
        include proxy_params;
        proxy_pass http://127.0.0.1:8000;  # Change the port if Gunicorn is running on a different port
        proxy_redirect off;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Additional configurations (SSL, etc.) can be added here
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }
}
server {
listen 80;
server_name server_domain_or_IP;
location = /favicon.ico { access_log off; log_not_found off; }
location /static/ {
root /home/sammy/myprojectdir;
}
location / {
include proxy_params;
proxy_pass http://unix:/run/gunicorn.sock;
}
}
server {
server_name 194.195.115.231;
access_log on;
location = /favicon.ico { access_log off; log_not_found off; }
location /static/ {
root /home/mis/Mahaguru-International-School-Backend/backend;
}
location / {
include proxy_params;
proxy_pass http://unix:/run/gunicorn.sock;
}
}
Important: Static Location should be properly set
If the django project is within /opt/myenv/
location /static/ { alias /opt/myenv/mtm_django_app/django_app; }
You also need to add the static directory to your urls.py file. Add the following:
from django.conf import settings
from django.conf.urls.static import static

urlpatterns = [ ... ] + static(settings.STATIC_URL, document_root=settings.STATIC_ROOT)
Link server block to sites-enabled
sudo ln -s /etc/nginx/sites-available/mis_backend /etc/nginx/sites-enabled
Enable the file
Save and close the file when you are finished. Now, you can enable the file by linking it to the sites-enabled directory:
sudo ln -s /etc/nginx/sites-available/myproject /etc/nginx/sites-enabled
Test
sudo nginx -t
Status
systemctl status nginx.service
Errors
sudo tail -f /var/log/nginx/error.log
Restart nginx
sudo systemctl restart nginx
Step 16: Remove the rule to open port 8000
sudo ufw delete allow 8000
sudo ufw allow 'Nginx Full'
Allow port 5432 for the database
ufw allow 5432
Restarting
If you update your Django application, you can restart the Gunicorn process to pick up the changes by typing:
sudo systemctl restart gunicorn
If you change Gunicorn socket or service files, reload the daemon and restart the process by typing:
sudo systemctl daemon-reload
sudo systemctl restart gunicorn.socket gunicorn.service
If you change the Nginx server block configuration, test the configuration and then restart Nginx by typing:
sudo nginx -t && sudo systemctl restart nginx
Debugging & Logs
Check your /etc/systemd/system/gunicorn.service file for problems.
- Check the Nginx process logs by typing:
sudo journalctl -u nginx
- Check the Nginx access logs by typing:
sudo less /var/log/nginx/access.log
- Check the Nginx error logs by typing:
sudo less /var/log/nginx/error.log
- Check the Gunicorn application logs by typing:
sudo journalctl -u gunicorn
- Check the Gunicorn socket logs by typing:
sudo journalctl -u gunicorn.socket
Known Issues
2024/01/12 07:55:11 [error] 60919#60919: *607 connect() failed (111: Unknown error) while connecting to upstream, client: 54.37.79.75, server: 194.195.115.231, request: "POST / HTTP/1.1", upstream: "http://127.0.0.1:8001/", host: "194.195.115.231"
Make sure the IP address and port are correct. (Don't add the port number to the URL in the browser if you followed the steps above.)
psycopg2 installation failure
pip install psycopg2-binary
On a Debian/Ubuntu-based system:
sudo apt-get install libpq-dev
On a Red Hat/Fedora-based system:
sudo dnf install postgresql-devel
Known Issues: pdfkit not found
Install wkhtmltopdf
sudo apt-get install -y wkhtmltopdf
Known Issues: If pgadmin4 is set up, you have to stop the Apache server
sudo apachectl stop
Known Issues: Allowed Host Not Setup
Edit settings.py to add your_domain_OR_ip to ALLOWED_HOSTS = []
SSL & HTTPS
- Installing Certbot
Certbot recommends using their snap package for installation. Snap packages work on nearly all Linux distributions, but they require that you’ve installed snapd first in order to manage snap packages. Ubuntu 22.04 comes with support for snaps out of the box, so you can start by making sure your snapd core is up to date:
sudo snap install core; sudo snap refresh core
⚠️ If you're working on a server that previously had an older version of certbot installed, you should remove it before going any further:
sudo apt remove certbot
Install the certbot package:
sudo snap install --classic certbot
- Link
Finally, you can link the certbot command from the snap install directory to your path, so you'll be able to run it by just typing certbot. This isn't necessary with all packages, but snaps tend to be less intrusive by default, so they don't conflict with any other system packages by accident:
sudo ln -s /snap/bin/certbot /usr/bin/certbot
Step 2 — Confirming Nginx’s Configuration
Certbot needs to be able to find the correct server block in your Nginx configuration for it to be able to automatically configure SSL. Specifically, it does this by looking for a server_name directive that matches the domain you request a certificate for.
If you followed the server block set up step in the Nginx installation tutorial, you should have a server block for your domain at /etc/nginx/sites-available/example.com with the server_name directive already set appropriately.
To check, open the configuration file for your domain using nano or your favorite text editor:
sudo nano /etc/nginx/sites-available/example.com
Find the existing server_name line. It should look like this:
/etc/nginx/sites-available/example.com
...
server_name example.com www.example.com;
...
If it does, exit your editor and move on to the next step.
If it doesn’t, update it to match. Then save the file, quit your editor, and verify the syntax of your configuration edits:
sudo nginx -t
If you get an error, reopen the server block file and check for any typos or missing characters. Once your configuration file’s syntax is correct, reload Nginx to load the new configuration:
sudo systemctl reload nginx
Certbot can now find the correct server block and update it automatically.
Next, let’s update the firewall to allow HTTPS traffic.
Step 3 — Allowing HTTPS Through the Firewall
If you have the ufw firewall enabled, as recommended by the prerequisite guides, you’ll need to adjust the settings to allow for HTTPS traffic. Luckily, Nginx registers a few profiles with ufw upon installation.
You can see the current setting by typing:
sudo ufw status
It will probably look like this, meaning that only HTTP traffic is allowed to the web server:
Output
Status: active
To Action From
-- ------ ----
OpenSSH ALLOW Anywhere
Nginx HTTP ALLOW Anywhere
OpenSSH (v6) ALLOW Anywhere (v6)
Nginx HTTP (v6) ALLOW Anywhere (v6)
To additionally let in HTTPS traffic, allow the Nginx Full profile and delete the redundant Nginx HTTP profile allowance:
sudo ufw allow 'Nginx Full'
sudo ufw delete allow 'Nginx HTTP'
Your status should now look like this:
sudo ufw status
Output
Status: active
To Action From
-- ------ ----
OpenSSH ALLOW Anywhere
Nginx Full ALLOW Anywhere
OpenSSH (v6) ALLOW Anywhere (v6)
Nginx Full (v6) ALLOW Anywhere (v6)
Next, let’s run Certbot and fetch our certificates.
Step 4 — Obtaining an SSL Certificate
Certbot provides a variety of ways to obtain SSL certificates through plugins. The Nginx plugin will take care of reconfiguring Nginx and reloading the config whenever necessary. To use this plugin, type the following:
sudo certbot --nginx -d example.com -d www.example.com
This runs certbot with the --nginx plugin, using -d to specify the domain names we’d like the certificate to be valid for.
When running the command, you will be prompted to enter an email address and agree to the terms of service. After doing so, you should see a message telling you the process was successful and where your certificates are stored:
Output
IMPORTANT NOTES:
Successfully received certificate.
Certificate is saved at: /etc/letsencrypt/live/your_domain/fullchain.pem
Key is saved at: /etc/letsencrypt/live/your_domain/privkey.pem
This certificate expires on 2022-06-01.
These files will be updated when the certificate renews.
Certbot has set up a scheduled task to automatically renew this certificate in the background.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
If you like Certbot, please consider supporting our work by:
* Donating to ISRG / Let's Encrypt: https://letsencrypt.org/donate
* Donating to EFF: https://eff.org/donate-le
Your certificates are downloaded, installed, and loaded, and your Nginx configuration will now automatically redirect all web requests to https://. Try reloading your website and notice your browser’s security indicator. It should indicate that the site is properly secured, usually with a lock icon. If you test your server using the SSL Labs Server Test, it will get an A grade.
Let’s finish by testing the renewal process.
Step 5 — Verifying Certbot Auto-Renewal
Let’s Encrypt’s certificates are only valid for ninety days. This is to encourage users to automate their certificate renewal process. The certbot package we installed takes care of this for us by adding a systemd timer that will run twice a day and automatically renew any certificate that’s within thirty days of expiration.
You can query the status of the timer with systemctl:
sudo systemctl status snap.certbot.renew.service
Output
○ snap.certbot.renew.service - Service for snap application certbot.renew
Loaded: loaded (/etc/systemd/system/snap.certbot.renew.service; static)
Active: inactive (dead)
TriggeredBy: ● snap.certbot.renew.timer
To test the renewal process, you can do a dry run with certbot:
sudo certbot renew --dry-run
If you see no errors, you're all set. When necessary, Certbot will renew your certificates and reload Nginx to pick up the changes. If the automated renewal process ever fails, Let's Encrypt will send a message to the email you specified, warning you when your certificate is about to expire.
Run
python3 manage.py runserver
Specify port:
python manage.py runserver 8002
DB Setup
settings.py contains database configuration.
SQLite
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': BASE_DIR / 'db.sqlite3',
}
}
Postgres
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'msigma_db',
'USER': 'postgres',
'PASSWORD': 'admin',
'HOST': '127.0.0.1',
'PORT': '5432',
}
}
python3 manage.py migrate
Django Superuser
python manage.py createsuperuser
Visit http://127.0.0.1:8000/admin/login/?next=/admin/login to log in using the credentials from the step above.
The createsuperuser command creates a user in the user table of the configured database.
Django Style Guide
The core of the Django Styleguide can be summarized as follows:
In Django, business logic should live in:
- Services - functions that mostly take care of writing things to the database.
- Selectors - functions that mostly take care of fetching things from the database.
- Model properties (with some exceptions).
- Model clean method for additional validations (with some exceptions).
In Django, business logic should not live in:
- APIs and Views.
- Serializers and Forms.
- Form tags.
- Model save method.
- Custom managers or querysets.
- Signals.
Model properties vs selectors:
- If the property spans multiple relations, it is better off as a selector.
- If the property is non-trivial & can easily cause the N + 1 queries problem when serialized, it is better off as a selector.
The general idea is to "separate concerns" so those concerns can be maintainable / testable.
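As a rough illustration of that separation, here is a minimal sketch (the Order model, its fields, and the function names are made up for this example):
# services.py - writes
from django.db import transaction
from .models import Order  # hypothetical model

@transaction.atomic
def order_create(*, user, total):
    order = Order(user=user, total=total)
    order.full_clean()  # run model-level validation before saving
    order.save()
    return order

# selectors.py - reads
def order_list_for_user(*, user):
    return Order.objects.filter(user=user).select_related("user")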
Sample README.md file
# Msigma Entity Resource Planning Service
This is the backend service for an **Entity Resource Planning (ERP)** system built with Django and Django REST Framework (DRF). This backend handles various business functions, such as managing resources, inventory, orders, and users.
## Table of Contents
- [Project Overview](#project-overview)
- [Features](#features)
- [Tech Stack](#tech-stack)
- [Getting Started](#getting-started)
- [Configuration](#configuration)
- [Database Migrations](#database-migrations)
- [Running the Application](#running-the-application)
- [Testing](#testing)
- [API Documentation](#api-documentation)
- [Contributing](#contributing)
## Project Overview
The ERP Backend is designed to provide comprehensive management functionality to enable seamless operations for businesses. Key features include product management, inventory management, purchase order processing, finance management, user management, and role-based access control.
## Features
- **Product Management**: Add and manage products with category and uom (unit of measurement)
- **Inventory Tracking**: Track and manage items, stock levels, and supplier data.
- **Purchase Order Management**: Create, update, and track orders. Create queries for purchase order if any issue arises.
- **Finance Management**: Raise invoices for purchase orders, update status.
- **User and Role Management**: Authenticate users with JWT, manage user roles, and handle permissions. Users are part of groups with various access levels and product and/or inventory ownership
- **Email Communications**: Email notifications are sent for various actions and raised queries.
## Tech Stack
- **Python**: Main programming language
- **Django**: Web framework
- **Django RQ**: Worker Queue
- **Django REST Framework (DRF)**: API development
- **PostgreSQL**: Database
- **JWT**: Authentication
- **Sendgrid**: Email Service
- **Docker**: Containerization for development and deployment
- **Logtail**: Log aggregation
- **Coverage**: Testing Code Coverage
## Getting Started
### Prerequisites
- **Python 3.10+**
- **Poetry**
- **PostgreSQL**
- **Docker** (for containerized development)
### Clone the Repository
```bash
git clone https://gt.mgsigma.net/erpnext/msigma-erp-service.git
cd msigma-erp-service
```
### Install Dependencies
Install the required Python packages with:
```bash
poetry install
```
## Configuration
### Environment Variables
Create a .env.local file in the root directory with the following variables:
```plaintext
HOST_API_PORT=8001
DEBUG=0
SECRET_KEY=<KEY_HERE>
ENVIRONMENT=local
POSTGRES_DB=msigma-erp-db
POSTGRES_USER=postgres
POSTGRES_PASSWORD=<PASSWORD_HERE>
POSTGRES_HOST=127.0.0.1
POSTGRES_PORT=5432
DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 0.0.0.0
CSRF_TRUSTED_ORIGINS=http://127.0.0.1:1337 http://127.0.0.1:8000 http://localhost:3000 http://127.0.0.1:3000
CORS_ALLOWED_ORIGINS=http://127.0.0.1:3000
NGINX_PORT=1337
QUEUE_HOST=localhost
QUEUE_PORT=11000
QUEUE_DB_INDEX=0
QUEUE_TIMEOUT=360
SENDGRID_API_KEY=<KEY_HERE>
SENDGRID_FROM_EMAIL=<FROM_EMAIL_HERE>
SENDGRID_PO_EMAIL_TEMPLATE=<ID_HERE>
SENDGRID_PO_QUERY_EMAIL_TEMPLATE=<ID_HERE>
SENDGRID_STOCK_EMAIL_TEMPLATE=<ID_HERE>
LOGTAIL_SOURCE_TOKEN=<TOKEN_HERE>
```
You may adjust the variables based on your environment.
## Database Migrations
Run migrations to set up the database schema:
```bash
poetry run python manage.py migrate
```
## Running the Application
Start the Django development server:
```bash
poetry run python manage.py runserver
```
By default, the server will run on http://127.0.0.1:8000.
## Testing
### Run tests
```bash
poetry run python manage.py test
```
This project uses Django Test Cases with JWT token authentication for secured endpoints.
### Coverage
Check coverage:
```bash
poetry run coverage run manage.py test
```
Check coverage of particular app:
```bash
poetry run coverage run --source=<app_name> manage.py test <app-name>
```
Generate Report:
```bash
poetry run coverage report -m
```
## API Documentation
API documentation is available with Django REST Framework's browsable API. Access it at:
`http://127.0.0.1:8000/api/schema/swagger-ui/`
after starting the server.
## Docker Setup
To build and run the project with Docker:
### Build and start the containers
```bash
docker compose -f docker-compose.dev.yml up
```
The application will be available on http://127.0.0.1:1337.
## Contributing
- Fork the project.
- Create your feature branch (git checkout -b feature/YourFeature).
- Commit your changes (git commit -m 'Add YourFeature').
- Push to the branch (git push origin feature/YourFeature).
- Open a pull request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
Apps
Create
Each app should do one thing (cart should manage cart related stuff etc).
python manage.py startapp products
python manage.py startapp cart
These commands create apps with the given names in the src folder.
Include
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
# ...
# third party
# own
'products',
]
Model
Define
Simple Model
from django.db import models
class Product(models.Model):
title = models.TextField()
description = models.TextField()
price = models.TextField()
Following field types can be used:

Model With Verification
class Product(models.Model):
title = models.CharField(max_length=120)
description = models.TextField(blank=True, null=True)
price = models.DecimalField(decimal_places=2, max_digits=10000)
summary = models.TextField(blank=False, null=False)
featured = models.BooleanField(default=False)
blank is for form validation of the field and null is for the database column.
default is applied to existing rows when the field is newly added, and to new rows when no value is provided.
Example Model
class Base(models.Model):
class Extend(Enum):
CRASH = 'CRASH'
FULL = 'FULL'
id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
name = models.CharField(max_length=50, blank=True, null=False)
description = models.CharField(max_length=500, blank=True, null=True)
type = models.CharField(max_length=10, blank=True, null=False)
year = models.IntegerField(default=0, null=True)
status = models.CharField(max_length=10, blank=True, null=False)
university = models.ForeignKey(
University, on_delete=models.CASCADE, null=True)
branch = models.ForeignKey(Branch, on_delete=models.CASCADE, null=True)
schema = models.ForeignKey(Schema, on_delete=models.CASCADE, null=True)
version = models.ForeignKey(Version, on_delete=models.CASCADE, null=True)
batch = models.ForeignKey(Batch, on_delete=models.CASCADE, null=True)
extend = models.CharField(max_length=50, choices=[
(tag.name, tag.value) for tag in Extend])
created_at = models.DateTimeField(auto_now=False, auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True, auto_now_add=False)
class MyModel(models.Model):
boolean_field = models.BooleanField()
char_field = models.CharField(max_length=100)
date_field = models.DateField()
datetime_field = models.DateTimeField()
decimal_field = models.DecimalField(max_digits=5, decimal_places=2)
duration_field = models.DurationField()
email_field = models.EmailField()
file_field = models.FileField(upload_to='uploads/')
float_field = models.FloatField()
image_field = models.ImageField(upload_to='images/')
integer_field = models.IntegerField()
many_to_many_field = models.ManyToManyField(MyOtherModel)
one_to_many_field = models.ForeignKey(
MyOtherModel, on_delete=models.CASCADE)
positive_integer_field = models.PositiveIntegerField()
positive_small_integer_field = models.PositiveSmallIntegerField()
slug_field = models.SlugField()
small_integer_field = models.SmallIntegerField()
text_field = models.TextField()
time_field = models.TimeField()
url_field = models.URLField()
uuid_field = models.UUIDField()
# Generic foreign key field ** Requires additional setup
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
object_id = models.PositiveIntegerField()
content_object = GenericForeignKey('content_type', 'object_id')
Slug Field
Store user friendly URLs: Instead of having a URL like example.com/posts/123, a slug can be used to create a URL like example.com/posts/my-first-post
from django.db import models
from django.utils.text import slugify
class Post(models.Model):
title = models.CharField(max_length=100)
slug = models.SlugField()
def save(self, *args, **kwargs):
# Generate a slug from the title
self.slug = slugify(self.title)
super(Post, self).save(*args, **kwargs)
On Delete Methods
CASCADE: When the referenced object is deleted, also delete the objects that have a foreign key to it. This is the most common choice for ForeignKey and OneToOneField relationships. It ensures that no orphaned records are left in the database.
PROTECT: Prevent deletion of the referenced object by raising a ProtectedError if any objects still reference it.
SET_NULL: Set the foreign key to NULL when the referenced object is deleted. This option is only applicable to fields that can be nullable (e.g., null=True on the field definition).
SET_DEFAULT: Set the foreign key to its default value when the referenced object is deleted. This option requires a default value to be specified on the field.
SET(): Set the foreign key to the value passed to the SET() argument when the referenced object is deleted. This option allows you to specify a callable or a model method to determine the value.
DO_NOTHING: Do nothing when the referenced object is deleted. You are responsible for handling the situation manually to maintain referential integrity.
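A small sketch showing two of these options side by side (Author, Publisher and Book are made-up models):
class Author(models.Model):
    name = models.CharField(max_length=100)

class Publisher(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    # deleting the author also deletes their books
    author = models.ForeignKey(Author, on_delete=models.CASCADE)
    # deleting the publisher keeps the book and clears this field
    publisher = models.ForeignKey(Publisher, null=True, blank=True, on_delete=models.SET_NULL)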
Apply
python manage.py makemigrations
python manage.py migrate
makemigrations creates migrations.
migrate applies created migrations.
Add Model To Admin
from django.contrib import admin
from .models import Product

admin.site.register(Product)
The model will now be visible in the Django admin panel.
Ways to Extend the Existing User Model
Generally speaking, there are four different ways to extend the existing User model. Read below why and when to use them.
Option 1: Using a Proxy Model
What is a Proxy Model?
It is a model inheritance without creating a new table in the database. It is used to change the behaviour of an existing model (e.g. default ordering, add new methods, etc.) without affecting the existing database schema.
When should I use a Proxy Model?
You should use a Proxy Model to extend the existing User model when you don’t need to store extra information in the database, but simply add extra methods or change the model’s query Manager.
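A minimal proxy model sketch (the ordering and helper method here are just examples):
from django.contrib.auth.models import User

class Person(User):
    class Meta:
        proxy = True              # no new table; same data as User
        ordering = ['first_name']

    def full_name(self):
        return f"{self.first_name} {self.last_name}"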
Option 2: Using One-To-One Link With a User Model (Profile)
What is a One-To-One Link?
It is a regular Django model that will have its own database table and will hold a One-To-One relationship with the existing User Model through a OneToOneField.
When should I use a One-To-One Link?
You should use a One-To-One Link when you need to store extra information about the existing User Model that’s not related to the authentication process. We usually call it a User Profile.
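A minimal User Profile sketch (the extra fields are illustrative):
from django.conf import settings
from django.db import models

class Profile(models.Model):
    user = models.OneToOneField(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    bio = models.TextField(blank=True)
    location = models.CharField(max_length=100, blank=True)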
Option 3: Creating a Custom User Model Extending AbstractBaseUser
What is a Custom User Model Extending AbstractBaseUser?
It is an entirely new User model that inherits from AbstractBaseUser. It requires special care and updating some references in settings.py. Ideally it should be done at the beginning of the project, since it will dramatically impact the database schema. Take extra care while implementing it.
When should I use a Custom User Model Extending AbstractBaseUser?
You should use a Custom User Model when your application has specific requirements in relation to the authentication process. For example, in some cases it makes more sense to use an email address as your identification token instead of a username.
Option 4: Creating a Custom User Model Extending AbstractUser
What is a Custom User Model Extending AbstractUser?
It is a new User model that inherits from AbstractUser. It requires special care and updating some references in settings.py. Ideally it should be done at the beginning of the project, since it will dramatically impact the database schema. Take extra care while implementing it.
When should I use a Custom User Model Extending AbstractUser?
You should use it when you are perfectly happy with how Django handles the authentication process and you wouldn't change anything about it. Yet, you want to add some extra information directly to the User model, without having to create an extra class (like in Option 2).
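A minimal AbstractUser sketch (the extra field and the app label are illustrative):
from django.contrib.auth.models import AbstractUser
from django.db import models

class User(AbstractUser):
    phone = models.CharField(max_length=20, blank=True)  # example extra field

# settings.py
AUTH_USER_MODEL = 'your_app.User'  # replace with your app label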
Fields in User Model (django.contrib.auth.models.User)
username: A unique identifier for the user.
password: The hashed and encrypted password.
first_name: The first name of the user (optional).
last_name: The last name of the user (optional).
email: The email address of the user (optional).
is_staff: A boolean field indicating whether the user is a staff member (e.g., has access to the admin site).
is_active: A boolean field indicating whether the user account is active.
date_joined: The date and time when the user account was created.
last_login: The date and time of the user's last login.
Signals
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.utils.crypto import get_random_string
# assumes Participant and Ticket are models defined in this app
@receiver(post_save, sender=Participant)
def create_ticket(sender, instance, created, **kwargs):
"""
A signal handler function to create a Ticket when a Participant is created.
"""
if created: # Only create a Ticket when a Participant is newly created
key = get_random_string(length=10)
Ticket.objects.create(key=key, participant=instance)
# In your app's apps.py or a relevant module
from django.apps import AppConfig
class YourAppConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'your_app' # Replace with your actual app name
def ready(self):
import your_app.signals # Replace with the actual name of your signals module
Without Using Signals
from django.db import models
from django.utils.crypto import get_random_string
class ParticipantManager(models.Manager):
def create_participant_with_ticket(self, **kwargs):
key = get_random_string(length=10)
participant = self.create(**kwargs)
ticket = Ticket(key=key, participant=participant)
ticket.save()
return participant
class Participant(models.Model):
# Your Participant fields here
objects = ParticipantManager()
participant = Participant.objects.create_participant_with_ticket(name='John', email_alpha='john@example.com')
Lecture (15 Minutes)
Title: Understanding Django Signals
Introduction (1 minute):
- Explain the importance of Django signals in web development.
- Outline what will be covered in the lecture.
What Are Django Signals? (2 minutes):
- Django signals are a mechanism for allowing various parts of a Django application to communicate with each other.
- They provide a way for decoupled applications to get notified when certain actions occur elsewhere in the application.
Why Use Signals? (2 minutes):
- Signals enable loose coupling between different components of your application, making it more modular and maintainable.
- They allow you to extend and customize Django's behavior without modifying core code.
- Signals are widely used for implementing various functionalities and for triggering follow-up work such as background tasks (note: the signal handler itself runs synchronously in the main Django thread; it must hand work off to a task queue to actually run in the background).
- Sending emails
User Registration: Sending a welcome email to a user after registration.
- Cache Invalidation & Updation:
- Signals can be used to invalidate a cache when an object is updated.
- Use the post_save signal to automatically clear the cache when a model instance is saved (see the Cache Set & Invalidation Demo below).
- Logging: Illustrate how signals can be used for logging important events in your application.
Basic Concepts (3 minutes):
- Signals involve senders and receivers.
- Senders:
- Objects that send signals when certain events happen.
- The sender is the object that sends or emits the signal.
- When an event occurs within the sender object (e.g., a model instance is saved), it can trigger the associated signal.
- Receivers
- A receiver function is a Python function or method that handles the signal when it is emitted.
- It performs specific actions or logic in response to the signal.
- Receiver functions are also known as signal handlers.
- Django provides some built-in signals and allows you to create custom signals. (More about that later)
- pre_save: Sent just before a model's save() method is called. This signal is often used for performing actions or validations before a model is saved to the database.
- post_save: Sent just after a model's save() method is called and the object is saved to the database. This signal is commonly used for performing actions after an object is saved, such as sending notifications or updating related records.
- pre_delete: Sent just before a model's delete() method is called. This signal is useful for performing actions or validations before a model is deleted from the database.
- post_delete: Sent just after a model's delete() method is called and the object is deleted from the database. This signal is often used for cleanup tasks or related actions after an object is deleted.
- m2m_changed: Sent when a ManyToMany relationship is changed (e.g., objects are added or removed from the relationship). This signal is useful for reacting to changes in ManyToMany relationships.
- pre_init: Sent when an instance of a model is created using its constructor but before any field values are assigned. This signal can be used to perform custom initialization before object creation.
- post_init: Sent when an instance of a model is created using its constructor and after all field values are assigned. This signal can be used for additional setup or validation of object attributes.
- class_prepared: Sent after a model's class has been fully prepared (e.g., fields and methods are defined). This signal is useful for dynamically adding or modifying attributes of a model class.
- request_started: Sent at the beginning of an HTTP request. This signal can be used to perform actions when a request is received.
- request_finished: Sent at the end of processing an HTTP request, just before sending the response. This signal can be used for cleanup tasks after request processing.
- got_request_exception: Sent when an exception is raised during request processing. This signal is helpful for logging or handling exceptions globally.
- setting_changed: Sent when a Django setting is changed. This signal can be used to react to changes in application settings.
- Signal instances: Represent specific signals that can be connected to sender objects and associated with one or more receiver functions
Here's an explanation of key concepts related to Signal instances in Django:
- Signal Class (django.dispatch.Signal): The Signal class is used to define a new signal. It serves as a blueprint for creating instances of signals that represent specific events or actions in your application.
- Connecting Signals: To associate a signal with a sender and a receiver function, you use the Signal.connect() method. This establishes a connection between the signal and the function, indicating that when the signal is sent by the sender, the connected function should be executed.
- Disconnecting Signals: You can also disconnect a signal from a receiver function using the Signal.disconnect() method. This breaks the connection between the signal and the function.
- Signal Sending (Signal.send()): To trigger a signal, you use the Signal.send() method. This method sends the signal to all connected receiver functions, allowing them to execute in response to the event.
Demo:
from django.dispatch import Signal

# Define a Signal instance
my_signal = Signal()

# Define a receiver function
def my_receiver(sender, **kwargs):
    print("Signal received from sender:", sender)

# Connect the signal to the receiver function
my_signal.connect(my_receiver)

# Somewhere in your code, trigger the signal
sender_object = SomeModel()  # Replace with your sender object
my_signal.send(sender=sender_object)
By using the @receiver Decorator: Both .connect() and @receiver are used to connect signal handlers (receiver functions) to signals, but they have slightly different use cases and syntax.
.connect() Method: .connect() is a method of the Signal class and is used to explicitly connect a signal to a receiver function.
- Usage: Connect a signal in a separate module / a signal conditionally.
- Usage: Additional parameters for the connection, such as dispatch_uid (a unique identifier that mainly prevents a handler from being connected more than once).
@receiver Decorator (Python decorator provided by Django):
- Connect a signal to a receiver function by decorating the function directly.
- It is often used when you want to keep the signal connection code close to the receiver function.
- It is also used when you don't need to specify additional parameters for the connection.
from django.dispatch import receiver

@receiver(my_signal, sender=SomeModel)
def my_receiver(sender, **kwargs):
    # Your signal handling logic here
    ...
Cache Set & Invalidation Demo
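A minimal sketch of the cache set & invalidation pattern, assuming a Product model and Django's default cache backend (the cache key format is illustrative):
from django.core.cache import cache
from django.db.models.signals import post_save
from django.dispatch import receiver
from .models import Product  # hypothetical model

def get_product(pk):
    key = f"product:{pk}"
    product = cache.get(key)          # cache hit?
    if product is None:
        product = Product.objects.get(pk=pk)
        cache.set(key, product, timeout=300)  # cache for 5 minutes
    return product

@receiver(post_save, sender=Product)
def invalidate_product_cache(sender, instance, **kwargs):
    cache.delete(f"product:{instance.pk}")  # drop the stale entry when the object is saved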
Best Practices and Tips (2 minutes):
- Explain some best practices when working with signals:
- Document your signals and their intended use.
- Avoid tight coupling by using signals sparingly.
- Consider using Django's built-in signals when they meet your needs.
Common Pitfalls (1 minute):
- Mention some common mistakes or issues to watch out for when using signals.
- Infinite Loops: Be cautious when using signals to modify the same model that triggers the signal. This can lead to infinite loops where signals keep triggering each other. Workaround: You can use conditional checks.
- Order of Signal Handlers: The order in which signal handlers are executed is not guaranteed across apps; handlers run in the order they were connected. If ordering matters in your application, connect handlers explicitly in a known order (dispatch_uid only ensures a handler is not connected more than once; it does not control execution order).
- Signal Disconnects: Be mindful of disconnecting signals. Disconnecting signals without a good reason can lead to unexpected behavior, as certain functionality that relies on signals may not work as expected.
- Database Queries: Avoid making unnecessary database queries within signal handlers, especially inside loops. Excessive database queries can degrade performance. Use select_related or prefetch_related to optimize database queries if needed.
- Circular Imports: Circular import issues can occur when you import a model in a signal handler and the model also imports the module with the signal handler. To avoid this, consider using Django's get_model function to load models lazily.
- Failing to Dispatch Signals: Django's built-in signals are dispatched automatically, but custom signals need to be explicitly sent using the send() method. Failing to dispatch a custom signal will result in the signal not being processed.
- Signal Overuse: Overusing signals can make the codebase complex and hard to follow. Be judicious in your use of signals, and consider whether a simpler, direct approach might be more appropriate.
- Testing: Don't forget to test your signal handlers. Ensure that they work as expected under various scenarios, including edge cases and error conditions. Writing tests for signals is essential for robustness.
- Performance Impact: Signals can have a performance impact, especially if they're handling complex operations or if there are a large number of connected signal handlers. Profile your application to identify performance bottlenecks related to signals.
- Documentation: Maintain clear and well-documented signal handling code. Include comments explaining why signals are used and what each signal handler does. This helps other developers understand the purpose and behavior of the signals.
- Deprecation and Compatibility: Be aware that signal behavior and API can change between Django versions. When upgrading Django, review the release notes and ensure your signal handling code is still compatible.
- Signal Handlers in Migrations: Avoid using signal handlers in migrations, as they can lead to unexpected behavior during database schema changes. Use data migrations or post-migrate signals if you need to perform data-related tasks during migrations.
- Importance of testing signal handling functions.
Conclusion (1 minute):
- Recap
Revert Migrations
You can revert by migrating to the previous migration.
For example, if your last two migrations are:
0010_previous_migration
0011_migration_to_revert
Then you would do:
./manage.py migrate my_app 0010_previous_migration
You don't actually need to use the full migration name; the number is enough, i.e.
./manage.py migrate my_app 0010
You can then delete migration 0011_migration_to_revert.
If you're using Django 1.8+, you can show the names of all the migrations with
./manage.py showmigrations my_app
To reverse all migrations for an app, you can run:
./manage.py migrate my_app zero
Django Shell
Open
python manage.py shell
This will open a shell with Django functionality loaded.
List Objects in Model
from products.models import Product
Product.objects.all()
Create
Product.objects.create(title='New', description='new item', price='324')
Pages & Views
Create
python manage.py startapp pages
Setup
URL Patterns & Routing
from django.http import HttpResponse
from django.shortcuts import render
def home_view(*args, **kwargs):
return HttpResponse("<h1>Hello World</h1>")
def about_view(*args, **kwargs):
return HttpResponse("<h1>About Page</h1>")
from django.contrib import admin
from django.urls import path
from pages.views import home_view
from pages.views import about_view
urlpatterns = [
path('', home_view, name='home'),
path('about/', about_view),
path('admin/', admin.site.urls),
]
Django Template
Create
<h1>HELLO WORLD</h1>
<p>Template File</p>
Setup
Change views.py to include request and render
from django.http import HttpResponse
from django.shortcuts import render
def home_view(request, *args, **kwargs):
return render(request, "home.html", {})
Add the templates folder path to the TEMPLATES setting in settings.py
from pathlib import Path
BASE_DIR = Path(__file__).resolve().parent.parent
# ...
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [Path.joinpath(BASE_DIR, 'templates')],
#...
},
]
REST API
Create & Include
python manage.py startapp api
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
#...
'rest_framework',
'api',
]
Setup
Setup API Views (or routes):
from django.http import JsonResponse
def api_home(request, *args, **kwargs):
return JsonResponse({"message": "Hello, World!"}, status=200)
Add API routes in urls:
from django.urls import path
from . import views
urlpatterns = [
path('', views.api_home, name='api_home'),
]
Add API urls in project urls:
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
path('admin/', admin.site.urls),
path('api/', include('api.urls')),
]
Request & Data
Return data, headers and content_type back as response.
from django.http import JsonResponse
import json
def api_home(request, *args, **kwargs):
body = request.body # byte string of JSON data
data = {}
try:
data = json.loads(body)
except:
pass
data['params'] = request.GET # contains query params
data['headers'] = dict(request.headers)
data['content_type'] = request.content_type
return JsonResponse(data, status=200)
GET
Send GET request to: http://localhost:8002/api/
Model Serializers
from rest_framework import serializers
from .models import Product
class ProductSerializer(serializers.ModelSerializer):
class Meta:
model = Product
fields = [
'title',
'content',
'price',
'sale_price',
]
Add Foreign Key fields to Response in ModelViewSet view:
class TimetableSerializer(serializers.ModelSerializer):
grade_name = serializers.ReadOnlyField(source='grade.name')
class Meta:
model = Timetable
fields = '__all__'
class ItemSerializer(serializers.ModelSerializer):
stock = serializers.SerializerMethodField()
created_by = serializers.SerializerMethodField()
class Meta:
model = Item
fields = "__all__"
def get_created_by(self, obj):
created_by_username = f"{obj.created_by.first_name} {obj.created_by.last_name}"
return created_by_username
def get_stock(self, obj):
stock = ItemStock.objects.filter(item=obj).order_by("-created_at").first()
return ItemStockSerializer(stock).data if stock else None
Serializer Custom Error Message
from rest_framework import serializers
from .models import BtechToken
def validate_origin(value):
available_types = [choice[0] for choice in BtechToken.OriginType.choices]
if value not in available_types:
raise serializers.ValidationError(
f"Invalid origin type. Available types: {', '.join(available_types)}"
)
class BtechTokenSerializer(serializers.ModelSerializer):
origin = serializers.CharField(max_length=100, validators=[validate_origin])
class Meta:
model = BtechToken
fields = "__all__"
REST Framework View & Response
Import required model and use . methods to get data from them.
from rest_framework.response import Response
from rest_framework.decorators import api_view
from products.models import Product
from products.serializers import ProductSerializer
@api_view(['GET'])
def api_home(request, *args, **kwargs):
instance = Product.objects.all().order_by("?").first()
data = {}
if instance:
data = ProductSerializer(instance).data
return Response(data, status=200)
Only GET requests are allowed. To allow more methods, use @api_view(['GET', 'POST']) etc.
Get Properties and Methods
from django.db import models
class Product(models.Model):
title = models.CharField(max_length=120)
content = models.TextField(blank=True, null=True)
price = models.DecimalField(decimal_places=2, max_digits=15, default=99.99)
@property
def sale_price(self):
return "%.2f" % (float(self.price) * 0.8)
def get_discount(self):
return "122"from rest_framework import serializers
from .models import Product
class ProductSerializer(serializers.ModelSerializer):
my_discount = serializers.SerializerMethodField(read_only=True)
class Meta:
model = Product
fields = [
'title',
'content',
'price',
'sale_price',
'my_discount'
]
def get_my_discount(self, obj):
print(obj.id)
# obj.user => access user in obj
return obj.get_discount()
Here the get_discount method is exposed as my_discount, and that field will be visible in the API response.
Generics RetrieveAPIView
from rest_framework import generics
from .models import Product
from .serializers import ProductSerializer
class ProductDetailAPIView(generics.RetrieveAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
# lookup_field = 'pk'
product_detail_view = ProductDetailAPIView.as_view()
from django.urls import path
from . import views
urlpatterns = [
path('<int:pk>/',views.product_detail_view)
]
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
path('admin/', admin.site.urls),
path('api/', include('api.urls')),
path('api/products/', include('products.urls')),
]
Generics ListAPIView
class ProductListAPIView(generics.ListAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
product_list_view = ProductListAPIView.as_view()
Pagination
REST_FRAMEWORK = {
'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
'PAGE_SIZE': 100
}
Offset/Limit Pagination vs Cursor-Based Pagination
from rest_framework.pagination import CursorPagination, PageNumberPagination
Cursor pagination is most often used for real-time data due to the frequency with which new records are added, and because when reading data you often see the latest results first. There are different scenarios in which offset and cursor pagination make the most sense, so it will depend on the data itself and how often new records are added. When querying static data, the performance cost alone may not be enough for you to use a cursor, as the added complexity that comes with it may be more than you need.
Pagination is a solution to this problem that ensures that the server only sends data in small chunks. Cursor-based pagination is our recommended approach over numbered pages, because it eliminates the possibility of skipping items and displaying the same item more than once. In cursor-based pagination, a constant pointer (or cursor) is used to keep track of where in the data set the next items should be fetched from.
If you ask for page1 using PageNumberPagination, and after that a new item is added to the list, and you then ask for page2, the last item you just got in page1 will be shown again as the first item in page2 (shown 2 times in a row). CursorPagination is a much more recommended way than PageNumberPagination. CursorPagination keeps reference to objects, and does not have to calculate content for each page. For implementation see stackoverflow.com/a/47657610/5881884
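A minimal CursorPagination sketch, reusing the Product model and serializer from above; the ordering field is an assumption (it should be an indexed, mostly-immutable field such as a creation timestamp):
from rest_framework import generics
from rest_framework.pagination import CursorPagination

class CreatedAtCursorPagination(CursorPagination):
    page_size = 20
    ordering = '-created_at'  # newest records first

class ProductCursorListAPIView(generics.ListAPIView):
    queryset = Product.objects.all()
    serializer_class = ProductSerializer
    pagination_class = CreatedAtCursorPagination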
POST
Send POST request to: http://localhost:8002/api/
Serializer
from rest_framework import serializers
from .models import Product
class ProductSerializer(serializers.ModelSerializer):
my_discount = serializers.SerializerMethodField(read_only=True)
class Meta:
model = Product
fields = [
'title',
'content',
'price',
'sale_price',
'my_discount'
]
def get_my_discount(self, obj):
try:
return obj.get_discount()
except:
return None
REST View
Serializer will raise exception when request data fails validation.
from rest_framework.response import Response
from rest_framework.decorators import api_view
from products.models import Product
from products.serializers import ProductSerializer
@api_view(['POST'])
def api_home(request, *args, **kwargs):
serializer = ProductSerializer(data=request.data)
if serializer.is_valid(raise_exception=True):
return Response(serializer.data, status=200)
Generics CreateAPIView
class ProductCreateAPIView(generics.CreateAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
def perform_create(self, serializer):
# serializer.save(user=self.request.user)
title = serializer.validated_data.get('title')
content = serializer.validated_data.get('content')
if content is None:
content = title
serializer.save(content=content)
product_create_view = ProductCreateAPIView.as_view()
urlpatterns = [
path('',views.product_create_view),
path('<int:pk>/',views.product_detail_view)
]
POST & GET
Generics ListCreateAPIView
POST & GET methods to the endpoint will create or list all objects according to the method of request using same view.
class ProductListCreateAPIView(generics.ListCreateAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
def perform_create(self, serializer):
# serializer.save(user=self.request.user)
title = serializer.validated_data.get('title')
content = serializer.validated_data.get('content')
if content is None:
content = title
serializer.save(content=content)
product_list_create_view = ProductListCreateAPIView.as_view()
APIView
transaction.atomic wraps the save operations, ensuring that either all of them succeed or none of them do (rollback).
class ItemStockAddView(APIView):
permission_classes = [IsAuthenticated]
def post(self, request, format=None):
with transaction.atomic():
stock_serializer = ItemStockSerializer(data=request.data)
if stock_serializer.is_valid():
item_id = request.data.get("item_id")
if item_id is None:
raise serializers.ValidationError("item_id missing")
item = ProductItem.objects.get(id=item_id)
stock = stock_serializer.save(item=item, created_by=self.request.user)
quantities = request.data.get("quantities")
if quantities is None:
raise serializers.ValidationError("quantities missing")
for quantity in quantities:
quantity_serializer = ItemQuantitySerializer(data=quantity)
if quantity_serializer.is_valid():
quantity_serializer.save(
item_stock=stock, created_by=self.request.user
)
else:
raise serializers.ValidationError(quantity_serializer.errors)
return Response(
{"stock": stock_serializer.data}, status=status.HTTP_201_CREATED
)
return Response(stock_serializer.errors, status=status.HTTP_400_BAD_REQUEST)
PUT
Generics UpdateAPIView
class ProductUpdateAPIView(generics.UpdateAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
lookup_field = 'pk'
def perform_update(self, serializer):
instance = serializer.save()
if not instance.content:
instance.content = instance.title
product_update_view = ProductUpdateAPIView.as_view()
URLS
from django.urls import path
from . import views
urlpatterns = [
path('',views.product_list_create_view),
path('<int:pk>/', views.product_detail_view),
path('<int:pk>/update/', views.product_update_view),
]
Class For CRUD
Generics View vs ModelViewSet
viewsets.ModelViewSet and generics.ListAPIView are both classes provided by Django Rest Framework (DRF) for building API views, but they serve slightly different purposes. Let's compare them:
viewsets.ModelViewSet: viewsets.ModelViewSet is a class that combines several common actions related to a Django model into a single view. It's typically used when you want to create a complete CRUD (Create, Read, Update, Delete) API for a Django model. It provides a set of predefined actions that correspond to common HTTP methods:
GET: Retrieve a list of instances or a single instance.
POST: Create a new instance.
PUT or PATCH: Update an existing instance.
DELETE: Delete an instance.
Key points:
- Provides multiple actions for CRUD operations.
- Automatically generates URLs for various actions.
- Requires defining a serializer and queryset for the model.
Example usage:
from rest_framework import viewsets
from .models import MyModel
from .serializers import MyModelSerializer

class MyModelViewSet(viewsets.ModelViewSet):
    queryset = MyModel.objects.all()
    serializer_class = MyModelSerializer
generics.ListAPIView: generics.ListAPIView is a class that provides a read-only view for listing instances of a Django model. It's typically used when you only need to retrieve a list of instances without supporting other CRUD operations. It provides only the GET action to retrieve a list. Key points:
- Limited to listing instances (read-only).
- Requires defining a serializer and queryset for the model.
- Doesn't automatically generate URLs for other actions.
Example usage:
from rest_framework import generics
from .models import MyModel
from .serializers import MyModelSerializer

class MyModelListView(generics.ListAPIView):
    queryset = MyModel.objects.all()
    serializer_class = MyModelSerializer
In summary, if you're building a complete CRUD API for a model, viewsets.ModelViewSet is a convenient choice as it provides actions for all CRUD operations. On the other hand, if you only need a read-only list view, generics.ListAPIView is more suitable. Choose the appropriate class based on your specific API requirements.
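One practical difference worth noting: a ModelViewSet is usually wired up with a router rather than explicit path() entries. A minimal sketch (route prefix and basename are illustrative):
from rest_framework.routers import DefaultRouter
from .views import MyModelViewSet

router = DefaultRouter()
router.register(r'mymodels', MyModelViewSet, basename='mymodel')

urlpatterns = router.urls  # generates list/detail/create/update/delete routes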
Middlewares
Order Of Middleware
Each middleware has a "before" and an "after" part to it: code that runs before the view is called (on the way in) and code that runs after the view returns a response (on the way out).
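A sketch of a custom middleware that makes the two halves explicit (the class name and the response header are made up for illustration):
import time

class TimingMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response  # one-time configuration when the server starts

    def __call__(self, request):
        # "before" part: runs on the way in, in the order middlewares are listed in MIDDLEWARE
        start = time.monotonic()

        response = self.get_response(request)  # inner middlewares and the view run here

        # "after" part: runs on the way out, in reverse order
        response["X-Request-Duration"] = f"{time.monotonic() - start:.3f}s"
        return response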

CSRF Exempt For Views
Exemption for Session Authentication
Some views, such as LoginView, can be exempted from the CSRF check since it is the first view called; it returns the CSRF token that is used to authenticate all subsequent requests.
This concerns Session Authentication (usually used for web views); Token Authentication, typically used for API views, is not subject to CSRF checks.
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
@csrf_exempt
def my_view(request):
return HttpResponse("Hello world")
class LoginView(APIView):
# Setting attribute to request to avoid csrf check
def initialize_request(self, request, *args, **kwargs):
setattr(request, 'csrf_processing_done', True)
return super().initialize_request(request, *args, **kwargs)
    def post(self, request, format=None):
        # authenticate the user and return the session/CSRF or auth token here
        ...
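For class-based views, plain Django also offers method_decorator to exempt the whole view; note, however, that DRF's SessionAuthentication performs its own CSRF check, so for DRF views the initialize_request override above (or a custom authentication class) may still be needed:
from django.utils.decorators import method_decorator
from django.views.decorators.csrf import csrf_exempt

@method_decorator(csrf_exempt, name='dispatch')
class LoginView(APIView):
    ...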
File Uploads
Documentation
DRF Spectacular
To implement drf_spectacular in a Django project, you can follow these steps:
- Install drf_spectacular: You can install it via pip:
pip install drf-spectacular
- Add drf_spectacular to INSTALLED_APPS: Add 'drf_spectacular' to the INSTALLED_APPS list in your settings.py file:
INSTALLED_APPS = [
    ...
    'drf_spectacular',
    ...
]
- Include drf_spectacular URLs: Include the drf_spectacular URLs in your project's URLconf. You can do this by adding the following to your project's urls.py file (the views are imported from drf_spectacular.views):
from drf_spectacular.views import SpectacularAPIView, SpectacularRedocView, SpectacularSwaggerView

urlpatterns = [
    ...
    path('api/schema/', SpectacularAPIView.as_view(), name='schema'),
    path('api/schema/swagger-ui/', SpectacularSwaggerView.as_view(url_name='schema'), name='swagger-ui'),
    path('api/schema/redoc/', SpectacularRedocView.as_view(url_name='schema'), name='redoc'),
    ...
]
- Generate API Schema: drf_spectacular provides management commands to generate API schemas. Run the following command to generate the schema:
python manage.py spectacular --file schema.yml
This command will generate a YAML file named schema.yml containing the API schema.
- Use the Schema: With the schema generated, you can now access it using the specified URLs (/api/schema/, /api/schema/swagger-ui/, /api/schema/redoc/) in your browser. This gives you interactive documentation (Swagger UI or ReDoc) for your API. A related settings snippet follows this list.
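drf-spectacular also expects to be registered as DRF's default schema class in settings.py; per its documentation, something along these lines is needed (the TITLE/VERSION values are placeholders):
REST_FRAMEWORK = {
    # ... existing DRF settings ...
    'DEFAULT_SCHEMA_CLASS': 'drf_spectacular.openapi.AutoSchema',
}

SPECTACULAR_SETTINGS = {
    'TITLE': 'My Project API',
    'VERSION': '1.0.0',
}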
Advanced Concepts
Reapply Migrations When Tables Are Deleted
python manage.py migrate --fake app_name zero
python manage.py migrate app_name  # re-run the migrations to recreate the dropped tables
django-mptt
MPTT
Modified Preorder Tree Traversal is a technique for storing hierarchical data in a database.
The aim is to make retrieval operations very efficient.
The trade-off for this efficiency is that performing inserts and moving items around the tree is more involved, as there’s some extra work required to keep the tree structure in a good state at all times.
django-mptt
django-mptt is a reusable Django app which aims to make it easy to work with MPTT models.
Feature overview
- Simple registration of models - fields required for tree structure will be added automatically.
- The tree structure is automatically updated when you create or delete model instances, or change an instance’s parent.
- Each level of the tree is automatically sorted by a field (or fields) of your choice.
- New model methods are added to each registered model for:
- changing position in the tree
- retrieving ancestors, siblings, descendants
- counting descendants
- other tree-related operations
- A
TreeManagermanager is added to all registered models. This provides methods to:- move nodes around a tree, or into a different tree
- insert a node anywhere in a tree
- rebuild the MPTT fields for the tree (useful when you do bulk updates outside of django)
- Form fields for tree models.
- Utility functions for tree models.
- Template tags and filters for rendering trees.
Installation
pip install django-mptt
INSTALLED_APPS = (
'django.contrib.auth',
# ...
'mptt',
)
Setup Model
You must define a parent field which is a TreeForeignKey to 'self'. A TreeForeignKey is just a regular ForeignKey that renders form fields differently in the admin and a few other places.
Because you’re inheriting from MPTTModel, your model will also have a number of other fields: level, lft, rght, and tree_id. These fields are managed by the MPTT algorithm. Most of the time you won’t need to use these fields directly.
from django.db import models
from mptt.models import MPTTModel, TreeForeignKey
class Genre(MPTTModel):
name = models.CharField(max_length=50, unique=True)
parent = TreeForeignKey('self', on_delete=models.CASCADE, null=True, blank=True, related_name='children')
class MPTTMeta:
        order_insertion_by = ['name']
order_insertion_by: indicates the natural ordering of the data in the tree.
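A quick sketch of the tree helpers django-mptt adds to the Genre model above (the genre names are just sample data):
# build a small tree: Fiction -> Fantasy -> Epic Fantasy
fiction = Genre.objects.create(name="Fiction")
fantasy = Genre.objects.create(name="Fantasy", parent=fiction)
epic = Genre.objects.create(name="Epic Fantasy", parent=fantasy)

epic.get_ancestors()            # <QuerySet [Fiction, Fantasy]>
fiction.get_descendants()       # <QuerySet [Fantasy, Epic Fantasy]>
fiction.get_descendant_count()  # 2

# TreeManager method: recompute lft/rght/level after bulk changes made outside Django
Genre.objects.rebuild()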
JWT
JWT Create
from rest_framework import generics, status
from rest_framework.response import Response
from django.contrib.auth import authenticate
import jwt
class TokenCreateView(generics.CreateAPIView):
    def post(self, request, *args, **kwargs):
        username = request.data.get('username')
        password = request.data.get('password')
        user = authenticate(username=username, password=password)
        if user:
            payload = {'user_id': user.id}
            # PyJWT >= 2 returns a str; on PyJWT 1.x you would append .decode('utf-8')
            token = jwt.encode(payload, 'secret_key', algorithm='HS256')
            return Response({'token': token})
        return Response({'error': 'Invalid credentials'}, status=status.HTTP_400_BAD_REQUEST)
JWT Verify in Middleware
from django.contrib.auth import get_user_model
import jwt
class TokenAuthMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Expect an Authorization header of the form "Bearer <token>"
        auth_parts = request.META.get('HTTP_AUTHORIZATION', '').split()
        if len(auth_parts) == 2 and auth_parts[0].lower() == 'bearer':
            try:
                decoded_token = jwt.decode(auth_parts[1], 'secret_key', algorithms=['HS256'])
                request.user = get_user_model().objects.get(id=decoded_token['user_id'])
            except jwt.DecodeError:
                # The token is invalid or has been tampered with
                pass
            except get_user_model().DoesNotExist:
                # The user doesn't exist
                pass
        response = self.get_response(request)
        return response
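To activate the middleware, its dotted path has to be added to MIDDLEWARE in settings.py (the module path below is a placeholder for wherever the class actually lives):
MIDDLEWARE = [
    # ... Django's default middleware ...
    "myapp.middleware.TokenAuthMiddleware",
]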
Logs
https://docs.djangoproject.com/en/5.1/howto/logging/
Setup
import logging
logger = logging.getLogger(__name__)
Message
def some_view(request):
...
if some_risky_state:
        logger.warning("Platform is running at risk")
Log Levels
https://docs.djangoproject.com/en/5.1/topics/logging/#topic-logging-parts-loggers
DEBUG: Low level system information for debugging purposes
INFO: General system information
WARNING: Information describing a minor problem that has occurred.
ERROR: Information describing a major problem that has occurred.
CRITICAL: Information describing a critical problem that has occurred.
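Where these messages end up is controlled by the LOGGING setting; a minimal console-only configuration (a sketch; the root WARNING level is an arbitrary choice) could be:
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "root": {
        "handlers": ["console"],
        "level": "WARNING",
    },
}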
Testing
https://docs.djangoproject.com/en/5.1/topics/testing/overview/
Test Database
- Separate, blank databases are created for the tests.
- Regardless of whether the tests pass or fail, the test databases are destroyed when all the tests have been executed.
Cases of database preservation:
- Pass the test --keepdb option: this will preserve the test database between runs. If the database does not exist, it will first be created. Any migrations will also be applied in order to keep it up to date.
- If a test run is forcefully interrupted, on the next run you'll be asked whether you want to reuse or destroy the database. Use the test --noinput option to suppress that prompt and automatically destroy the database. This can be useful when running tests on a continuous integration server where tests may be interrupted by a timeout, for example.
- The default test database names are created by prepending test_ to the value of each NAME in DATABASES.
- When using SQLite, the tests will use an in-memory database by default (bypassing the filesystem entirely). The TEST dictionary in DATABASES offers a number of settings to configure your test database; for example, to use a different database name, specify NAME in the TEST dictionary for any given database in DATABASES (see the sketch after this list). With an SQLite in-memory database, shared cache is enabled, so you can write tests with the ability to share the database between threads.
- On PostgreSQL, USER will also need read access to the built-in postgres database. The test database is created by the user specified by USER, so you'll need to make sure that the given user account has sufficient privileges to create a new database on the system.
- For fine-grained control over the character encoding of your test database, use the CHARSET TEST option.
- For MySQL, you can also use the COLLATION option to control the particular collation used by the test database. See the settings documentation for details of these and other advanced settings.
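A sketch of the TEST options mentioned above inside DATABASES (database names and credentials are placeholders):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'myproject',
        'USER': 'myprojectuser',
        'PASSWORD': 'password',
        'HOST': 'localhost',
        'TEST': {
            'NAME': 'mytestdatabase',  # overrides the default "test_myproject"
            'CHARSET': 'UTF8',         # character encoding of the test database
        },
    },
}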
Speeding up the tests
Running tests in parallel
As long as your tests are properly isolated, you can run them in parallel to gain a speed up on multi-core hardware. See test --parallel.
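For example, on recent Django versions the flag accepts an explicit process count or auto:
python manage.py test --parallel auto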
Simple CRUD Test Cases
With JWT token authentication and user creation
URL patterns should have a name attribute for each endpoint so that the reverse() function used in the test case class can resolve them.
setUp(): This method is executed before each individual test method in your test case class. Each test method will have its own isolated setUp() execution, so any objects created here will be available only in the specific test method it's associated with.
setUpTestData(): This method is executed once per test case class and is run before any tests. It is designed for setting up data that needs to be shared among all test methods in that class, and the data is created only once (not before each test method). It is class-level setup, not instance-level, meaning that it runs only once and doesn’t repeat before each test method.
Code
View, Model and URL files
# views.py
class ItemCategoryViewSet(viewsets.ModelViewSet):
    serializer_class = ItemCategorySerializer
    permission_classes = [IsAuthenticated]
    queryset = ItemCategory.objects.all()

# models.py
class ItemCategory(models.Model):
    name = models.CharField(max_length=50, unique=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    def __str__(self):
        return self.name

# urls.py
urlpatterns = [
    path(
        "item-category/",
        ItemCategoryViewSet.as_view({"get": "list", "post": "create"}),
        name="product-category",
    ),
    path(
        "item-category/<int:pk>/",
        ItemCategoryViewSet.as_view(
            {"get": "retrieve", "put": "update", "delete": "destroy"}
        ),
        name="product-category-detail",
    ),
]
class ProductCategoryViewTestCase(APITestCase):
    @classmethod
    def setUpTestData(cls):
        # add test user
        cls.username = "testuser"
        cls.password = "testpassword"
        cls.user = User.objects.create_user(
            username=cls.username, password=cls.password
        )
        # add test data
        cls.product_category = ItemCategory.objects.create(name="test_category_name")

    def setUp(self):
        # user
        self.username = "testuser"
        self.password = "testpassword"
        # authenticate
        login_url = reverse("login")
        login_response = self.client.post(
            login_url, {"username": self.username, "password": self.password}
        )
        self.assertEqual(login_response.status_code, status.HTTP_200_OK)
        # authorize
        self.access_token = login_response.json().get("access")
        self.client.credentials(HTTP_AUTHORIZATION=f"Bearer {self.access_token}")
        # setup url
        self.url = reverse("product-category")

    def test_authentication_required(self):
        """Ensure that authentication is required to access the view."""
        self.client.credentials()  # Remove the token
        response = self.client.get(self.url)
        self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)

    def test_get_product_category_list_authenticated(self):
        """Test retrieving a list of categories for an authenticated user."""
        response = self.client.get(self.url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        # check data
        self.assertIsInstance(response.data, list)
        self.assertGreater(len(response.data), 0)
        expected_keys = {
            "id",
            "name",
            "created_at",
            "updated_at",
        }
        item = response.data[0]
        self.assertEqual(set(item.keys()), expected_keys)

    def test_get_product_category_detail(self):
        """Test retrieving a single category detail for an authenticated user."""
        url = reverse("product-category-detail", args=[self.product_category.id])
        response = self.client.get(url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        expected_keys = {
            "id",
            "name",
            "created_at",
            "updated_at",
        }
        self.assertEqual(set(response.data.keys()), expected_keys)

    def test_create_category(self):
        """Test creating a category."""
        post_data = {
            "name": "Test Category",
        }
        response = self.client.post(self.url, data=post_data, format="json")
        self.assertEqual(response.status_code, status.HTTP_201_CREATED)
        self.assertEqual(response.data["name"], post_data["name"])
        item_exists = ItemCategory.objects.filter(name=post_data["name"]).exists()
        self.assertTrue(item_exists)

    def test_update_category(self):
        """Test updating a category."""
        url = reverse("product-category-detail", args=[self.product_category.id])
        data = {"name": "Updated Category"}
        response = self.client.put(url, data)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        self.product_category.refresh_from_db()
        self.assertEqual(self.product_category.name, data["name"])

    def test_delete_category(self):
        """Test deleting a category."""
        url = reverse("product-category-detail", args=[self.product_category.id])
        response = self.client.delete(url)
        self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
        self.assertFalse(
            ItemCategory.objects.filter(id=self.product_category.id).exists()
        )

    def tearDown(self):
        return super().tearDown()
Code Coverage
pip install coverage
poetry add coverage  # if poetry is used
Run
coverage run manage.py test
coverage report -m
# with poetry:
poetry run coverage run manage.py test
poetry run coverage report -m
Run For Specific App
coverage run --source=todo_app manage.py test todo_app
coverage report -m
Run Excluding Directories
poetry run coverage run --source=purchase --omit="purchase/migrations/*" manage.py test purchase
pyproject.toml
[tool.coverage.run]
omit = [
# omit anything in a .local directory anywhere
"*/.local/*",
# omit everything in /usr
"/usr/*",
# omit this single file
"utils/tirefire.py",
]
Requires: poetry add "coverage[toml]"
Exclude Parts of Code
By default, any line with a comment of pragma: no cover is excluded. If that line introduces a clause, for example, an if clause, or a function or class definition, then the entire clause is also excluded. Here the __repr__ function is not reported as missing:
class MyObject(object):
def __init__(self):
blah1()
blah2()
def __repr__(self): # pragma: no cover
        return "<MyObject>"
Visualize
poetry run coverage html
https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-22-04
https://www.hacksoft.io/blog/direct-to-s3-file-upload-with-django
