Comprehensive Guide to Backend Development with Supabase and MongoDB
Introduction
Part 1: Supabase
1.1 Supabase Overview
1.2 Supabase Authentication
1.3 Supabase Cloud Storage
Part 2: MongoDB
2.1 MongoDB Overview
2.2 MongoDB Authentication
2.3 MongoDB CRUD Operations
Part 3: Practical Tasks and Code Examples
3.1 Supabase Authentication Tasks
3.2 Supabase Cloud Storage Tasks
3.3 MongoDB CRUD Operations Tasks
Part 4: Interactive Learning Environment
4.1 Supabase Sample Project
4.2 MongoDB Sample Project
Conclusion
References
This comprehensive guide is designed to provide a detailed understanding of backend
web development using two powerful and popular platforms: Supabase and
MongoDB. Whether you are a beginner looking to grasp the fundamentals or an
experienced developer seeking to expand your knowledge, this guide will walk you
through the core concepts, features, and practical applications of both technologies.
Supabase, an open-source Firebase alternative, offers a suite of backend services
including a PostgreSQL database, authentication, real-time subscriptions, and storage.
Its focus on simplicity and developer-friendliness makes it an excellent choice for
rapidly building scalable applications. We will delve into its authentication
mechanisms, exploring various login methods and security features, as well as its
robust cloud storage capabilities for managing user-generated content and other files.
MongoDB, a leading NoSQL document database, provides a flexible and scalable
solution for storing and retrieving data. Its document-oriented model allows for
dynamic schemas, making it ideal for modern applications with evolving data
requirements. We will cover its authentication methods to secure your data and
explore the fundamental Create, Read, Update, and Delete (CRUD) operations that
form the backbone of data interaction.
Beyond theoretical explanations, this guide emphasizes hands-on learning. Each
section will be complemented with practical tasks and code examples, enabling you to
apply what you learn immediately. By the end of this guide, you will not only have a
solid theoretical foundation but also practical experience in building robust backend
systems with Supabase and MongoDB.
Part 1: Supabase
1.1 Supabase Overview
Supabase is an open-source Backend-as-a-Service (BaaS) platform that provides
developers with all the tools needed to build a backend without writing a single line of
backend code. It positions itself as an open-source alternative to Google's Firebase,
offering a similar suite of services but built around a PostgreSQL database. This choice
of PostgreSQL as its core database is a significant differentiator, providing the
familiarity and power of a relational database while still offering features commonly
found in NoSQL databases, such as real-time capabilities and flexible data structures
[1].
At its heart, Supabase provides a full Postgres database for every project. This means
developers get all the benefits of a mature, reliable, and feature-rich relational
database, including ACID compliance, robust querying capabilities, and extensibility.
Supabase extends Postgres with real-time functionality, allowing applications to listen
for database changes and react instantly. This is achieved through its Realtime Server,
which broadcasts database changes to subscribed clients [14].
Supabase offers a comprehensive set of features that simplify backend development:
Database: A dedicated PostgreSQL database for each project, offering full access
and control. It supports standard SQL queries and allows for complex data
modeling.
Authentication: A robust and secure authentication system that supports
various sign-in methods, including email/password, social logins (Google,
GitHub, etc.), magic links, and phone authentication. It also includes features like
Multi-Factor Authentication (MFA) and Row Level Security (RLS) for fine-grained
access control [5].
Storage: An S3-compatible object storage service for managing large files like
images, videos, and documents. It integrates seamlessly with Supabase Auth for
secure file access and supports resumable uploads [1].
Edge Functions: Serverless functions that can be deployed globally and
executed close to the user, reducing latency. These are powered by Deno and can
be used for custom backend logic, integrations, and more [9].
Realtime: As mentioned, this feature allows applications to listen to database
changes in real-time, enabling dynamic and interactive user experiences.
Auto-generated APIs: Supabase automatically generates RESTful and GraphQL
APIs directly from your PostgreSQL database schema. This means you can
interact with your database using standard HTTP requests without manually
building an API layer [9].
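To make the auto-generated API concrete, here is a minimal sketch that reads and inserts rows with the supabase-js client. The todos table, its columns, and the placeholder URL and key are hypothetical and only illustrate how PostgREST-backed queries look from application code:

// Minimal sketch: using the auto-generated REST API through supabase-js.
// The 'todos' table and the placeholder credentials are hypothetical.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://YOUR_PROJECT.supabase.co', 'YOUR_ANON_KEY');

async function listOpenTodos() {
  // PostgREST translates this call into a SQL SELECT with a WHERE clause
  const { data, error } = await supabase
    .from('todos')
    .select('id, title, is_complete')
    .eq('is_complete', false);
  if (error) console.error('Query failed:', error.message);
  return data;
}

async function addTodo(title) {
  // Becomes an HTTP POST against the same auto-generated endpoint
  const { error } = await supabase.from('todos').insert({ title, is_complete: false });
  if (error) console.error('Insert failed:', error.message);
}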
Supabase's architecture is designed to be modular and extensible. It leverages existing
open-source tools and combines them into a cohesive platform. For instance, it uses
PostgREST to turn your PostgreSQL database into a RESTful API, GoTrue for
authentication, and Storage for file management. This open-source approach provides
transparency, flexibility, and the ability for developers to self-host if desired [11].
In summary, Supabase aims to accelerate application development by providing a
powerful, scalable, and easy-to-use backend infrastructure. Its focus on PostgreSQL,
combined with a rich set of integrated services, makes it a compelling choice for a
wide range of web and mobile applications.
References
[1] Supabase | The Postgres Development Platform. URL: https://supabase.com/
[5] Auth | Built-in user management - Supabase. URL: https://supabase.com/auth
[9] Simplifying back-end complexity with Supabase Data APIs. URL: https://supabase.com/blog/simplify-backend-with-data-api
[11] Architecture | Supabase Docs. URL: https://supabase.com/docs/guides/getting-started/architecture
[14] Database | Supabase Docs. URL: https://supabase.com/docs/guides/database/overview
1.2 Supabase Authentication
Supabase Authentication is a comprehensive and flexible system designed to manage
user identities and access control within your applications. It is built on top of GoTrue,
an open-source JWT-based API for managing users and issuing JSON Web Tokens
(JWTs) [16]. This architecture ensures secure and stateless authentication, allowing for
easy integration with various frontend frameworks and client-side applications.
Key features of Supabase Authentication include:
Multiple Sign-in Methods: Supabase supports a wide array of authentication
methods to cater to diverse user preferences and application requirements.
These include:
Email and Password: The most common authentication method, allowing
users to register and log in using their email address and a secure
password. Supabase handles email verification and password reset flows
[10].
Social Logins (OAuth): Integration with popular OAuth providers such as
Google, GitHub, Facebook, Twitter, and more. This simplifies the
registration and login process for users, leveraging their existing accounts
[7].
Magic Links (Passwordless Login): A convenient and secure method where
users receive a unique, time-limited link via email. Clicking this link
authenticates them without requiring a password, enhancing user
experience and reducing friction [4].
Phone Logins: Users can authenticate using their phone numbers, typically
involving a one-time password (OTP) sent via SMS [2].
Multi-Factor Authentication (MFA): Adds an extra layer of security by
requiring users to verify their identity through multiple methods, such as a
password combined with a code from an authenticator app [11].
JSON Web Tokens (JWTs): Upon successful authentication, Supabase issues a
JWT to the client. This token contains information about the authenticated user
and is used to authorize subsequent requests to the Supabase API and your
PostgreSQL database. JWTs are cryptographically signed, ensuring their integrity
and authenticity [8].
Row Level Security (RLS): This is a powerful feature that works in conjunction
with Supabase Authentication to provide fine-grained access control directly
within your PostgreSQL database. RLS allows you to define policies that restrict
which rows a user can access, insert, update, or delete based on their
authentication status and other attributes. For example, you can ensure that
users can only view or modify their own data [19].
Server-Side Authentication: While Supabase is often used with client-side
applications, it also provides tools and utilities for implementing secure user
authentication in server-side environments. This is crucial for applications that
require server-side rendering or have complex backend logic [3].
Customization and Extensibility: Supabase Auth is highly customizable. You can
configure email templates for verification and password resets, integrate with
custom SMTP providers, and even extend the authentication flow with webhooks
and custom logic. The open-source nature of GoTrue also allows for deeper
customization if needed.
Security Best Practices: Supabase incorporates various security measures,
including CAPTCHA protection to prevent bot attacks, secure password storage
using salting and hashing, and protection against common web vulnerabilities
[1].
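To illustrate two of the sign-in methods described above, the following sketch uses supabase-js v2 for magic links and OAuth. It assumes an already initialized supabase client; the redirect URL and provider choice are placeholders:

// Magic link: Supabase emails the user a one-time sign-in link.
async function sendMagicLink(email) {
  const { error } = await supabase.auth.signInWithOtp({
    email,
    options: { emailRedirectTo: 'https://example.com/welcome' }, // hypothetical redirect
  });
  if (error) console.error('Magic link error:', error.message);
}

// Social login: redirects the browser to the provider's consent screen.
async function signInWithGitHub() {
  const { error } = await supabase.auth.signInWithOAuth({ provider: 'github' });
  if (error) console.error('OAuth error:', error.message);
}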
In essence, Supabase Authentication simplifies the complex task of user management
and security, allowing developers to focus on building their core application features.
Its integration with PostgreSQL and RLS provides a seamless and secure data access
layer, making it a robust choice for modern web and mobile applications.
References
[1] Supabase | The Postgres Development Platform. URL: https://supabase.com/
[2] Features | Supabase Docs. URL: https://supabase.com/docs/guides/getting-started/features
[3] Server-side Auth | Supabase Features. URL: https://supabase.com/features/server-side-auth
[4] Supabase Auth: The Ultimate Guide to User Authentication. URL: https://www.supaboost.dev/blog/supabase-auth
[7] Top 13 Supabase Features You Should Know In 2024 | Lanex AU. URL: https://lanex.au/blog/top-13-supabase-features-you-should-know-in-2024/
[8] Using Supabase as an Auth Service - DepsHub. URL: https://depshub.com/blog/using-supabase-auth-as-a-service-with-a-custom-backend/
[10] Supabase vs Clerk - DevTools Academy. URL: https://www.devtoolsacademy.com/blog/supabase-vs-clerk/
[11] Multi-Factor Authentication (MFA) | Supabase Features. URL: https://supabase.com/features/multi-factor-authentication
[16] supabase/auth: A JWT based API for managing users and ... - GitHub. URL: https://github.com/supabase/auth
[19] Authorization via Row Level Security | Supabase Features. URL: https://supabase.com/features/row-level-security
1.3 Supabase Cloud Storage
Supabase Cloud Storage provides a robust and scalable solution for managing user-
generated content and other files within your application. It is an S3-compatible object
storage service, meaning it adheres to the widely adopted Amazon S3 API, making it
familiar to developers who have worked with cloud storage services before. This
compatibility also allows for easy migration and integration with existing tools and
libraries that support S3 [7].
Key aspects and features of Supabase Cloud Storage include:
Object Storage Model: Supabase Storage stores data as objects within buckets.
An object can be any type of file (images, videos, documents, audio, etc.), and a
bucket is a container for these objects. This model is highly scalable and cost-
effective for storing large amounts of unstructured data [1].
Integration with Supabase Auth: One of the significant advantages of Supabase
Storage is its native integration with Supabase Authentication. This allows you to
define granular access rules for your files using PostgreSQL Row Level Security
(RLS). For example, you can create policies that ensure only authenticated users
can upload files, or that users can only access files they have uploaded
themselves. This eliminates the need for complex server-side logic to manage file
permissions [1].
Public and Private Files: You can configure buckets and individual files to be
either public or private. Public files are accessible via a direct URL without any
authentication, suitable for assets like profile pictures or public documents.
Private files require authentication and authorization to access, ensuring
sensitive data remains secure [1].
Resumable Uploads: For large files, Supabase Storage supports resumable
uploads. This feature allows users to pause and resume file uploads without
losing progress, which is particularly useful for handling high-quality video
uploads or large documents, improving the user experience in case of network
interruptions [6].
Image Optimization and Transformation: Supabase Storage can also serve
transformed renditions of stored images, such as resized or quality-adjusted
versions generated on the fly, which helps improve application performance and
reduce bandwidth costs. Availability of these transformations depends on your
project's plan, so check the current Supabase documentation (see the sketch
after this list).
Programmatic Access: Supabase provides client-side SDKs (e.g., JavaScript,
Flutter, Swift) that allow you to interact with the storage service directly from
your frontend application. This includes functions for uploading, downloading,
listing, and deleting files. You can also generate signed URLs for temporary,
secure access to private files [4].
Scalability and Performance: Built on scalable infrastructure, Supabase Storage
is designed to handle a high volume of uploads and downloads, ensuring good
performance even as your application grows. It leverages technologies like
Cloudflare CDN for faster content delivery [1].
Cost-Effective: Supabase offers a generous free tier for storage, with pricing
scaling based on usage (storage size and bandwidth). This makes it an attractive
option for projects of all sizes, from small personal projects to large-scale
applications [3].
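The snippet below sketches how such an image transformation might be requested through the JavaScript client. The transform option follows the supabase-js v2 storage API, but treat it as an illustrative assumption: it requires image transformations to be available on your plan, and the bucket name is carried over from the examples later in this guide:

// Hedged sketch: requesting a resized rendition of a stored image.
// Assumes an initialized supabase client, a 'user-uploads' bucket, and
// image transformations enabled for the project.
function getThumbnailUrl(filePath) {
  const { data } = supabase.storage
    .from('user-uploads')
    .getPublicUrl(filePath, {
      transform: { width: 200, height: 200 }, // server-side resize parameters
    });
  return data.publicUrl;
}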
In summary, Supabase Cloud Storage provides a secure, scalable, and developer-
friendly solution for managing files. Its tight integration with Supabase Auth and the
power of RLS make it a compelling choice for applications requiring robust file
management capabilities.
References
[1] Storage | Store any digital content - Supabase. URL: https://supabase.com/storage
[3] Pricing & Fees - Supabase. URL: https://supabase.com/pricing
[4] How To Use Supabase Storage For Uploading Files In Your API ... - YouTube. URL: https://www.youtube.com/watch?v=O0FqE0Rg7n0
[6] Resumable uploads | Supabase Features. URL: https://supabase.com/features/resumable-uploads
[7] supabase/storage: S3 compatible object storage service ... - GitHub. URL: https://github.com/supabase/storage
Part 2: MongoDB
2.1 MongoDB Overview
MongoDB is a leading open-source, document-oriented NoSQL database. Unlike
traditional relational databases that store data in tables with predefined schemas,
MongoDB stores data in flexible, JSON-like documents called BSON (Binary JSON).
This document model allows for rich and hierarchical data structures, making it highly
adaptable to modern application development where data structures can evolve
rapidly [1].
Key characteristics and features of MongoDB include:
Document Model: The core of MongoDB is its document-oriented data model.
Each record in MongoDB is a document, which is a data structure composed of
field and value pairs, similar to JSON objects. This flexibility means that
documents within the same collection do not need to have the same set of fields
or structure, allowing for agile development and easy schema evolution [1].
Collections and Databases: In MongoDB, documents are stored in collections,
which are analogous to tables in relational databases. Collections, in turn, are
organized within databases. A single MongoDB instance can host multiple
databases, and each database can contain multiple collections [11].
High Performance: MongoDB is designed for high performance. It achieves this
through features like:
Indexing: Supports various types of indexes to improve query performance,
including primary, compound, geospatial, and text indexes [8].
In-memory Storage Engine: For latency-sensitive workloads, MongoDB
Enterprise offers an in-memory storage engine that keeps the working data set
entirely in memory, providing extremely fast data access.
High Availability: MongoDB provides robust high availability through replica
sets. A replica set is a group of MongoDB servers that maintain the same data set,
providing redundancy and increasing data availability. If a primary server fails, an
election process ensures that a new primary is chosen from the remaining
members, minimizing downtime [5].
Scalability: MongoDB is built for horizontal scalability using sharding. Sharding
distributes data across multiple servers (shards) in a cluster, allowing the
database to handle larger data sets and higher throughput than a single server
could. This makes MongoDB suitable for applications with massive data volumes
and high traffic [5].
Rich Query Language: MongoDB provides a powerful and expressive query
language that supports a wide range of operations, including filtering, sorting,
aggregation, and geospatial queries. This allows developers to retrieve and
manipulate data efficiently [1].
Flexible Schema: The dynamic schema of MongoDB is a significant advantage. It
allows developers to change the structure of documents without requiring a full
schema migration, which can be a time-consuming and complex process in
relational databases. This flexibility is particularly beneficial in agile
development environments [1].
Drivers and Tools: MongoDB offers official drivers for a wide range of
programming languages (e.g., Python, Node.js, Java, C#), making it easy to
integrate into various application stacks. It also provides a rich ecosystem of
tools for administration, monitoring, and data visualization.
MongoDB is widely used in various industries for applications requiring high
scalability, flexibility, and performance, such as content management systems, mobile
applications, real-time analytics, and IoT data processing.
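As a concrete illustration of the flexible document model, the mongosh-style snippet below inserts two differently shaped documents into the same hypothetical products collection; the collection and field names are invented for this example:

// Documents in one collection may differ in fields and nesting.
db.products.insertMany([
  {
    name: "Laptop",
    price: 1299,
    specs: { cpu: "8-core", ram_gb: 16 }   // nested sub-document
  },
  {
    name: "Desk Lamp",
    price: 35,
    colors: ["black", "white"],            // array field the first document lacks
    discontinued: false
  }
])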
References
[1] Introduction to MongoDB - Database Manual. URL: https://www.mongodb.com/docs/manual/introduction/
[5] MongoDB Overview - Tutorials Point. URL: https://www.tutorialspoint.com/mongodb/mongodb_overview.htm
[8] Introduction to MongoDB - Tanja Adžić. URL: https://adzic-tanja.medium.com/introduction-to-mongodb-a1c574b331e2
[11] MongoDB - Database, Collection, and Document - GeeksforGeeks. URL: https://www.geeksforgeeks.org/mongodb/mongodb-database-collection-and-document/
2.2 MongoDB Authentication
MongoDB provides robust authentication mechanisms to ensure that only authorized
users can access and interact with your database. Implementing authentication is a
critical security measure, especially for production deployments. MongoDB supports
various authentication methods, allowing administrators to choose the most suitable
approach based on their security requirements and existing infrastructure [2].
Key aspects of MongoDB Authentication include:
User-Based Authentication: MongoDB's primary authentication model is user-
based. You create users within the database, and each user is associated with
specific roles that define their privileges (e.g., read-only access, read-write
access, administrative privileges). When a client attempts to connect to
MongoDB, they must provide valid credentials (username and password) to
authenticate [2].
Authentication Mechanisms: MongoDB supports several authentication
mechanisms, each offering different levels of security and integration
capabilities:
SCRAM (Salted Challenge Response Authentication Mechanism): This is
the default and recommended authentication mechanism for MongoDB.
SCRAM uses a challenge-response approach with salted hashes to protect
user credentials. It supports both SCRAM-SHA-1 and the more secure
SCRAM-SHA-256, which uses a stronger hashing algorithm [5, 16].
x.509 Certificate Authentication: This mechanism uses X.509 certificates
for authentication, providing strong security through public key
infrastructure. It is often used in environments where certificate
management is already in place and offers mutual authentication (both
client and server verify each other's identity) [4].
LDAP (Lightweight Directory Access Protocol) Authentication: MongoDB
Enterprise (a commercial version of MongoDB) supports integration with
LDAP directories. This allows organizations to authenticate MongoDB users
against their existing centralized user directories, simplifying user
management and enforcing corporate security policies [3].
Kerberos Authentication: Another enterprise-grade authentication
mechanism, Kerberos provides strong network authentication for
distributed systems. It is commonly used in large corporate environments
for single sign-on (SSO) capabilities [19].
OpenID Connect (OIDC) Authentication: MongoDB Enterprise also
supports OpenID Connect, an authentication layer built on top of OAuth 2.0.
This enables integration with identity providers that support OIDC,
facilitating modern authentication flows [1].
AWS IAM Authentication: For deployments on Amazon Web Services
(AWS), MongoDB supports authentication using AWS Identity and Access
Management (IAM) credentials, allowing for seamless integration with AWS
security practices [4].
Role-Based Access Control (RBAC): MongoDB implements RBAC, where
privileges are granted to roles, and roles are then assigned to users. This
simplifies permission management, as you can define a set of roles with specific
permissions and then assign these roles to multiple users. MongoDB provides a
set of built-in roles, and you can also define custom roles to meet specific
application needs [9].
Enabling Authentication: By default, a self-managed MongoDB deployment runs
without access control, which is acceptable only for local development. For
production deployments, it is crucial to enable authentication by setting
security.authorization: enabled in the MongoDB configuration file (or starting
mongod with the --auth option). Once enabled, all client connections must
authenticate [6].
Auditing: MongoDB Enterprise provides auditing capabilities, allowing
administrators to track and log all authenticated operations performed on the
database. This is essential for security compliance and forensic analysis [6].
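As a brief illustration of user creation and RBAC, the mongosh snippet below creates a user with the built-in readWrite role; the database name and credentials are placeholders, and in practice you would run this while connected with user-administration privileges:

// Create an application user with read/write access to one database.
db.getSiblingDB('learning_db').createUser({
  user: 'appUser',
  pwd: 'a-strong-password',   // in real use, prefer passwordPrompt() over a literal
  roles: [{ role: 'readWrite', db: 'learning_db' }]
})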
Properly configuring and managing authentication in MongoDB is vital for protecting
sensitive data and maintaining the integrity of your database. By leveraging the
various authentication mechanisms and RBAC, you can establish a secure and
controlled environment for your data.
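From application code, credentials are usually supplied in the connection string. The Node.js sketch below connects with SCRAM username/password authentication; the host, database, and credentials are placeholders matching the hypothetical user created above:

// Connecting to MongoDB with SCRAM credentials from Node.js.
const { MongoClient } = require('mongodb');

const uri =
  'mongodb://appUser:a-strong-password@localhost:27017/learning_db?authSource=learning_db';

async function main() {
  const client = new MongoClient(uri);
  try {
    await client.connect(); // fails if the credentials are rejected
    const users = client.db('learning_db').collection('users');
    console.log('Documents visible to appUser:', await users.countDocuments());
  } finally {
    await client.close();
  }
}

main().catch(console.error);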
References
[1] Authentication on Self-Managed Deployments - Database Manual. URL: https://www.mongodb.com/docs/manual/core/authentication/
[2] MongoDB Features & Key Characteristics. URL: https://www.mongodb.com/resources/products/capabilities/features
[3] MongoDB Authentication: Best Practices for Securing Access. URL: https://www.datasunrise.com/knowledge-center/mongodb-authentication/
[4] MongoDB Modern Database With Security Capabilities. URL: https://www.mongodb.com/products/capabilities/security
[5] Authentication Mechanisms in MongoDB - GeeksforGeeks. URL: https://www.geeksforgeeks.org/authentication-mechanisms-in-mongodb/
[6] Security - Database Manual - MongoDB Docs. URL: https://www.mongodb.com/docs/manual/security/
[9] MongoDB Authorization: A Practical Guide - Satori Cyber. URL: https://satoricyber.com/mongodb-authorization-a-practical-guide/
[16] MongoDB authentication - DBeaver Documentation. URL: https://dbeaver.com/docs/dbeaver/Authentication-MongoDB/
[19] MongoDB Enterprise Authentication Methods: LDAP, OIDC, and ... URL: https://studio3t.com/knowledge-base/articles/mongodb-enterprise-authentication/
2.3 MongoDB CRUD Operations
CRUD operations are the fundamental building blocks for interacting with any
database, and MongoDB is no exception. CRUD stands for Create, Read, Update, and
Delete, representing the four basic functions that can be performed on data. In
MongoDB, these operations are performed on documents within collections [2].
2.3.1 Create Operations (Insert)
Create operations, also known as insert operations, add new documents to a
collection. MongoDB provides several methods for inserting documents:
insertOne() : Inserts a single document into a collection. If the document does
not contain an _id field, MongoDB automatically adds one with a unique
ObjectId value [5].

db.collection.insertOne({ name: "Alice", age: 30, city: "New York" })
insertMany() : Inserts multiple documents into a collection. This method takes
an array of documents as its argument [5].

db.collection.insertMany([
  { name: "Bob", age: 25, city: "London" },
  { name: "Charlie", age: 35, city: "Paris" }
])
2.3.2 Read Operations (Query)
Read operations, or queries, retrieve documents from a collection. MongoDB provides
a powerful find() method for querying documents, allowing for various filtering,
sorting, and projection options [1].
find() : Selects documents in a collection. The find() method can take a
query filter document to specify criteria for selecting documents. If no filter is
provided, it returns all documents in the collection [1].

// Find all documents
db.collection.find({})

// Find documents where age is 30
db.collection.find({ age: 30 })

// Find documents where age is greater than 25
db.collection.find({ age: { $gt: 25 } })

// Find documents where city is New York and age is less than 35
db.collection.find({ city: "New York", age: { $lt: 35 } })
findOne() : Returns a single document that satisfies the specified query criteria.
If multiple documents satisfy the query, findOne() returns the first document
encountered [1].

db.collection.findOne({ name: "Alice" })
Projection: You can specify which fields to return in the query result using a
projection document. A value of 1 includes the field, and 0 excludes it (the _id
field is included by default unless explicitly excluded) [10].

// Return only name and city fields
db.collection.find({}, { name: 1, city: 1, _id: 0 })
Sorting: The sort() method orders the results of a query. A value of 1 specifies
ascending order, and -1 specifies descending order [10].

// Sort by age in ascending order
db.collection.find({}).sort({ age: 1 })
Limiting: The limit() method restricts the number of documents returned by a
query [10].

// Return only the first 5 documents
db.collection.find({}).limit(5)
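These modifiers can be chained for simple pagination, and countDocuments() reports how many documents match a filter. The page size below is an arbitrary example:

// Page 2 of results, 5 documents per page
const pageSize = 5;
const page = 2;

db.collection.find({})
  .sort({ age: 1 })              // establish a stable ordering first
  .skip((page - 1) * pageSize)   // skip documents from earlier pages
  .limit(pageSize)

// Count how many documents match a filter
db.collection.countDocuments({ age: { $gt: 25 } })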
2.3.3 Update Operations
Update operations modify existing documents in a collection. MongoDB provides
methods to update a single document or multiple documents [7].
updateOne() : Updates a single document that matches the specified filter [7].

// Update Alice's age to 31
db.collection.updateOne(
  { name: "Alice" },
  { $set: { age: 31 } }
)
updateMany() : Updates all documents that match the specified filter [7].

// Increment age of all documents by 1
db.collection.updateMany(
  {}, // Empty filter to select all documents
  { $inc: { age: 1 } }
)
replaceOne() : Replaces a single document entirely with a new document. The
new document must not contain an _id field unless it is the same as the existing
_id [7].

// Replace Bob's document
db.collection.replaceOne(
  { name: "Bob" },
  { name: "Robert", status: "active" }
)
2.3.4 Delete Operations
Delete operations remove documents from a collection. MongoDB provides methods
to delete a single document or multiple documents [7].
deleteOne() : Deletes a single document that matches the specified filter [7].

// Delete the document where name is Charlie
db.collection.deleteOne({ name: "Charlie" })
deleteMany() : Deletes all documents that match the specified filter. If an empty
filter {} is provided, it deletes all documents in the collection [7].

// Delete all documents where age is less than 20
db.collection.deleteMany({ age: { $lt: 20 } })
These CRUD operations form the foundation of data manipulation in MongoDB,
enabling developers to build dynamic and interactive applications.
References
[1] MongoDB CRUD Operations - Database Manual. URL: https://www.mongodb.com/docs/manual/crud/
[2] MongoDB CRUD Operations - GeeksforGeeks. URL: https://www.geeksforgeeks.org/mongodb/mongodb-crud-operations/
[5] MongoDB CRUD Operations: Insert and Find Documents. URL: https://learn.mongodb.com/courses/mongodb-crud-operations-insert-and-find-documents
[7] MongoDB CRUD Operations: Replace and Delete Documents. URL: https://learn.mongodb.com/courses/mongodb-crud-operations-replace-and-delete-documents
[10] MongoDB CRUD Operations: Modifying Query Results. URL: https://learn.mongodb.com/courses/mongodb-crud-operations-modifying-query-results
Part 3: Practical Tasks and Code Examples
3.1 Supabase Authentication Tasks
To solidify your understanding of Supabase Authentication, here are some practical
tasks you can perform. These tasks will guide you through the process of setting up a
simple authentication flow in a web application. We will use JavaScript and the
Supabase client library for these examples.
Prerequisites:
1. Create a Supabase Project: If you haven't already, go to supabase.com, create a
new project, and get your Project URL and anon key from the API settings.
2. Set up a simple HTML file: Create an index.html file and include the Supabase
client library from a CDN:
<!DOCTYPE html>
<html>
  <head>
    <title>Supabase Auth Example</title>
    <script src="https://cdn.jsdelivr.net/npm/@supabase/supabase-js@2"></script>
  </head>
  <body>
    <h1>Supabase Auth Example</h1>
    <script src="app.js"></script>
  </body>
</html>
3. Create a JavaScript file: Create an app.js file and initialize the Supabase
client:

// The CDN script exposes the library on window; reading window.supabase
// explicitly avoids a naming clash with the supabase constant declared below.
const { createClient } = window.supabase;

const SUPABASE_URL = 'YOUR_SUPABASE_URL';
const SUPABASE_ANON_KEY = 'YOUR_SUPABASE_ANON_KEY';

const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);

Replace 'YOUR_SUPABASE_URL' and 'YOUR_SUPABASE_ANON_KEY' with your actual
project URL and anon key.
Task 1: Sign Up a New User
Objective: Implement a function to sign up a new user with an email and password.
Code Example (app.js):
async function signUpNewUser(email, password) {
const { data, error } = await supabase.auth.signUp({
email: email,
password: password,
});
if (error) {
console.error('Error signing up:', error.message);
return null;
} else {
console.log('Sign up successful! Please check your email for verification.');
return data.user;
}
}
// Example usage:
signUpNewUser('test@example.com', 'password123');
To Do:
1. Add the signUpNewUser function to your app.js file.
2. Call the function with a sample email and password.
3. Check the browser console for the output.
4. Go to your Supabase project dashboard, under Authentication > Users, and you
should see the new user.
Task 2: Log In a User
Objective: Implement a function to log in an existing user with their email and
password.
Code Example (app.js):
async function logInUser(email, password) {
const { data, error } = await supabase.auth.signInWithPassword({
email: email,
password: password,
});
if (error) {
console.error('Error logging in:', error.message);
return null;
} else {
console.log('Login successful!');
return data.user;
}
}
// Example usage (after signing up and verifying the email):
logInUser('test@example.com', 'password123');
To Do:
1. Add the logInUser function to your app.js file.
2. Call the function with the email and password of the user you created in Task 1.
3. Check the browser console for the output.
Task 3: Fetch User Information
Objective: Implement a function to get the currently logged-in user's information.
Code Example (app.js):
async function getUser() {
const { data: { user } } = await supabase.auth.getUser()
if (user) {
console.log('Current user:', user);
return user;
} else {
console.log('No user is currently logged in.');
return null;
}
}
// Example usage (after logging in):
getUser();
To Do:
1. Add the getUser function to your app.js file.
2. Call the function after a successful login.
3. Check the browser console for the user object.
Task 4: Log Out a User
Objective: Implement a function to log out the currently authenticated user.
Code Example (app.js):
async function logOutUser() {
const { error } = await supabase.auth.signOut();
if (error) {
console.error('Error logging out:', error.message);
} else {
console.log('Logout successful!');
}
}
// Example usage (after logging in):
logOutUser();
To Do:
1. Add the logOutUser function to your app.js file.
2. Call the function after a successful login.
3. Call the getUser() function again to confirm that no user is logged in.
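As an optional extension, you can react to login and logout events instead of calling getUser() repeatedly. The sketch below registers an auth state listener with supabase-js v2; the callback body is only an example:

// Listen for auth state changes (sign-in, sign-out, token refresh).
const { data: { subscription } } = supabase.auth.onAuthStateChange((event, session) => {
  console.log('Auth event:', event);                 // e.g. 'SIGNED_IN', 'SIGNED_OUT'
  console.log('Current user:', session?.user ?? null);
});

// Later, when the listener is no longer needed:
// subscription.unsubscribe();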
By completing these tasks, you will have a foundational understanding of how to
implement user authentication in your projects using Supabase.
3.2 Supabase Cloud Storage Tasks
These tasks will help you understand how to work with Supabase Cloud Storage for
uploading, downloading, and managing files. Before starting, ensure you have a
Supabase project set up and a storage bucket created.
Prerequisites:
1. Create a Storage Bucket: In your Supabase project dashboard, go to Storage and
create a new bucket (e.g., "user-uploads").
2. Set Bucket Policies: Configure the bucket to allow authenticated users to upload
and access files. You can set this up in the Storage > Policies section.
Task 5: Upload a File
Objective: Implement a function to upload a file to a Supabase storage bucket.
HTML Addition (index.html):
<!-- Add this to your HTML body -->
<input type="file" id="fileInput" />
<button onclick="uploadFile()">Upload File</button>
Code Example (app.js):
async function uploadFile() {
const fileInput = document.getElementById('fileInput');
const file = fileInput.files[0];
if (!file) {
console.error('No file selected');
return;
}
// Generate a unique filename
const fileName = `${Date.now()}_${file.name}`;
const { data, error } = await supabase.storage
.from('user-uploads')
.upload(fileName, file);
if (error) {
console.error('Error uploading file:', error.message);
} else {
console.log('File uploaded successfully:', data);
console.log('File path:', data.path);
}
}
To Do:
1. Add the file input and button to your HTML.
2. Add the uploadFile function to your app.js file.
3. Select a file and click the upload button.
4. Check the browser console for the upload result.
5. Verify the file appears in your Supabase Storage dashboard.
Task 6: Download/Get File URL
Objective: Implement a function to get a public URL for an uploaded file.
Code Example (app.js):
async function getFileUrl(filePath) {
const { data } = supabase.storage
.from('user-uploads')
.getPublicUrl(filePath);
console.log('File URL:', data.publicUrl);
return data.publicUrl;
}
// For private files, use createSignedUrl instead:
async function getSignedUrl(filePath, expiresIn = 60) {
const { data, error } = await supabase.storage
.from('user-uploads')
.createSignedUrl(filePath, expiresIn);
if (error) {
console.error('Error creating signed URL:', error.message);
return null;
} else {
console.log('Signed URL:', data.signedUrl);
return data.signedUrl;
}
}
// Example usage (replace with actual file path from upload):
getFileUrl('1640995200000_example.jpg');
To Do:
1. Add the getFileUrl and getSignedUrl functions to your app.js file.
2. Use the file path from a successful upload in Task 5.
3. Call the function and check the console for the URL.
4. Try opening the URL in a new browser tab to verify access.
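Besides generating URLs, you can fetch a file's contents directly in the browser. The sketch below uses the storage download() method and turns the returned Blob into an object URL; the img element with id "preview" is a hypothetical addition to the HTML:

// Download a file as a Blob and display it in an <img id="preview"> element.
async function downloadAndPreview(filePath) {
  const { data, error } = await supabase.storage
    .from('user-uploads')
    .download(filePath);
  if (error) {
    console.error('Error downloading file:', error.message);
    return;
  }
  // data is a Blob; create a temporary URL the browser can render
  document.getElementById('preview').src = URL.createObjectURL(data);
}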
Task 7: List Files in a Bucket
Objective: Implement a function to list all files in a storage bucket.
Code Example (app.js):
async function listFiles() {
const { data, error } = await supabase.storage
.from('user-uploads')
.list('', {
limit: 100,
offset: 0,
});
if (error) {
console.error('Error listing files:', error.message);
} else {
console.log('Files in bucket:', data);
data.forEach(file => {
console.log(`- ${file.name} (${file.metadata?.size} bytes)`);
});
}
}
// Example usage:
listFiles();
To Do:
1. Add the listFiles function to your app.js file.
2. Call the function after uploading some files.
3. Check the console to see the list of files in your bucket.
Task 8: Delete a File
Objective: Implement a function to delete a file from the storage bucket.
Code Example (app.js):
async function deleteFile(filePath) {
const { data, error } = await supabase.storage
.from('user-uploads')
.remove([filePath]);
if (error) {
console.error('Error deleting file:', error.message);
} else {
console.log('File deleted successfully:', data);
}
}
// Example usage (replace with actual file path):
deleteFile('1640995200000_example.jpg');
To Do:
1. Add the deleteFile function to your app.js file.
2. Use the file path from a previously uploaded file.
3. Call the function and check the console for the result.
4. Verify the file is removed from your Supabase Storage dashboard.
These tasks provide hands-on experience with the core file management operations in
Supabase Cloud Storage.
3.3 MongoDB CRUD Operations Tasks
These tasks will help you practice MongoDB CRUD operations using Node.js and the
MongoDB driver. You'll learn to create, read, update, and delete documents in a
MongoDB database.
Prerequisites:
1. Install MongoDB: You can either install MongoDB locally or use MongoDB Atlas
(cloud service).
2. Set up Node.js project: Create a new directory and initialize a Node.js project:
mkdir mongodb-crud-example
cd mongodb-crud-example
npm init -y
npm install mongodb
3. Create connection file: Create a db.js file to handle database connection:
const { MongoClient } = require('mongodb');

// Replace with your MongoDB connection string
const uri = 'mongodb://localhost:27017'; // For local MongoDB
// For MongoDB Atlas: 'mongodb+srv://username:password@cluster.mongodb.net/'

const client = new MongoClient(uri);

async function connectToDatabase() {
  try {
    await client.connect();
    console.log('Connected to MongoDB');
    return client.db('learning_db'); // Database name
  } catch (error) {
    console.error('Error connecting to MongoDB:', error);
  }
}

module.exports = { connectToDatabase, client };
Task 9: Create (Insert) Documents
Objective: Practice inserting single and multiple documents into a MongoDB
collection.
Code Example (create.js):
const { connectToDatabase, client } = require('./db');
async function insertSingleUser() {
const db = await connectToDatabase();
const collection = db.collection('users');
const user = {
name: 'Alice Johnson',
email: 'alice@example.com',
age: 28,
city: 'New York',
createdAt: new Date()
};
try {
const result = await collection.insertOne(user);
console.log('User inserted with ID:', result.insertedId);
} catch (error) {
console.error('Error inserting user:', error);
}
}
async function insertMultipleUsers() {
const db = await connectToDatabase();
const collection = db.collection('users');
const users = [
{
name: 'Bob Smith',
email: 'bob@example.com',
age: 32,
city: 'London',
createdAt: new Date()
},
{
name: 'Charlie Brown',
email: 'charlie@example.com',
age: 25,
city: 'Paris',
createdAt: new Date()
},
{
name: 'Diana Prince',
email: 'diana@example.com',
age: 30,
city: 'Berlin',
createdAt: new Date()
}
];
try {
const result = await collection.insertMany(users);
console.log('Users inserted:', result.insertedCount);
console.log('Inserted IDs:', result.insertedIds);
} catch (error) {
console.error('Error inserting users:', error);
}
}
async function runCreateTasks() {
await insertSingleUser();
await insertMultipleUsers();
await client.close();
}
runCreateTasks();
To Do:
1. Create the create.js file with the code above.
2. Run the script: node create.js
3. Check the console output for successful insertions.
Task 10: Read (Query) Documents
Objective: Practice querying documents with various filters and options.
Code Example (read.js):
const { connectToDatabase, client } = require('./db');
async function findAllUsers() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
const users = await collection.find({}).toArray();
console.log('All users:');
users.forEach(user => {
console.log(`- ${user.name} (${user.email}) - Age: ${user.age}`);
});
} catch (error) {
console.error('Error finding users:', error);
}
}
async function findUserByEmail(email) {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
const user = await collection.findOne({ email: email });
if (user) {
console.log('Found user:', user);
} else {
console.log('User not found');
}
} catch (error) {
console.error('Error finding user:', error);
}
}
async function findUsersWithFilters() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Find users older than 25
const olderUsers = await collection.find({ age: { $gt: 25 } }).toArray();
console.log('Users older than 25:');
olderUsers.forEach(user => console.log(`- ${user.name} (Age: ${user.age})`));

// Find users with specific projection (only name and email)
const userNames = await collection.find({}, {
projection: { name: 1, email: 1, _id: 0 }
}).toArray();
console.log('User names and emails:');
userNames.forEach(user => console.log(`- ${user.name}: ${user.email}`));

// Find users sorted by age (ascending)
const sortedUsers = await collection.find({}).sort({ age: 1 }).toArray();
console.log('Users sorted by age:');
sortedUsers.forEach(user => console.log(`- ${user.name} (Age: ${user.age})`));
} catch (error) {
console.error('Error in filtered queries:', error);
}
}
async function runReadTasks() {
await findAllUsers();
console.log('\n---\n');
await findUserByEmail('alice@example.com');
console.log('\n---\n');
await findUsersWithFilters();
await client.close();
}
runReadTasks();
To Do:
1. Create the read.js file with the code above.
2. Run the script: node read.js
3. Observe the different query results in the console.
Task 11: Update Documents
Objective: Practice updating single and multiple documents.
Code Example (update.js):
const { connectToDatabase, client } = require('./db');
async function updateSingleUser() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Update Alice's age
const result = await collection.updateOne(
{ email: 'alice@example.com' },
{
$set: {
age: 29,
updatedAt: new Date()
}
}
);
console.log('Documents matched:', result.matchedCount);
console.log('Documents updated:', result.modifiedCount);
} catch (error) {
console.error('Error updating user:', error);
}
}
async function updateMultipleUsers() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Add a status field to all users
const result = await collection.updateMany(
{}, // Empty filter to match all documents
{
$set: {
status: 'active',
updatedAt: new Date()
}
}
);
console.log('Documents matched:', result.matchedCount);
console.log('Documents updated:', result.modifiedCount);
} catch (error) {
console.error('Error updating users:', error);
}
}
async function incrementAge() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Increment age of users in London by 1
const result = await collection.updateMany(
{ city: 'London' },
{
$inc: { age: 1 },
$set: { updatedAt: new Date() }
}
);
console.log('Documents matched:', result.matchedCount);
console.log('Documents updated:', result.modifiedCount);
} catch (error) {
console.error('Error incrementing age:', error);
}
}
async function runUpdateTasks() {
await updateSingleUser();
console.log('\n---\n');
await updateMultipleUsers();
console.log('\n---\n');
await incrementAge();
await client.close();
}
runUpdateTasks();
To Do:
1. Create the update.js file with the code above.
2. Run the script: node update.js
3. Run the read script again to see the updated data: node read.js
Task 12: Delete Documents
Objective: Practice deleting single and multiple documents.
Code Example (delete.js):
const { connectToDatabase, client } = require('./db');
async function deleteSingleUser() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Delete user with specific email
const result = await collection.deleteOne({ email:
'charlie@example.com' });
console.log('Documents deleted:', result.deletedCount);
if (result.deletedCount > 0) {
console.log('Charlie has been deleted');
} else {
console.log('No user found with that email');
}
} catch (error) {
console.error('Error deleting user:', error);
}
}
async function deleteMultipleUsers() {
const db = await connectToDatabase();
const collection = db.collection('users');
try {
// Delete users older than 30
const result = await collection.deleteMany({ age: { $gt: 30 } });
console.log('Documents deleted:', result.deletedCount);
} catch (error) {
console.error('Error deleting users:', error);
}
}
async function runDeleteTasks() {
await deleteSingleUser();
console.log('\n---\n');
await deleteMultipleUsers();
// Show remaining users
console.log('\nRemaining users:');
const db = await connectToDatabase();
const collection = db.collection('users');
const remainingUsers = await collection.find({}).toArray();
remainingUsers.forEach(user => {
console.log(`- ${user.name} (${user.email}) - Age: ${user.age}`);
});
await client.close();
}
runDeleteTasks();
To Do:
1. Create the delete.js file with the code above.
2. Run the script: node delete.js
3. Observe which users remain after the delete operations.
These tasks provide comprehensive hands-on experience with MongoDB CRUD
operations, covering all the fundamental database interactions you'll need in real
applications.
Part 4: Interactive Learning Environment
4.1 Supabase Sample Project
We have created a comprehensive React application that demonstrates Supabase
Authentication and Cloud Storage features. This interactive demo allows you to
experience the concepts covered in this guide firsthand.
Project Location: /sample_projects/supabase-demo/
Features Demonstrated:
1. User Authentication:
Sign up with email and password
Sign in with existing credentials
Magic link authentication (passwordless login)
User profile display
Sign out functionality
2. Cloud Storage:
File upload to user-specific folders
File listing and management
File download functionality
File deletion
Real-time storage updates
Setup Instructions:
1. Create a Supabase Project:
Go to supabase.com and create a new project
Note your Project URL and anon key from the API settings
2. Configure the Demo:
Open /sample_projects/supabase-demo/src/lib/supabase.js
Replace YOUR_SUPABASE_URL and YOUR_SUPABASE_ANON_KEY with your
actual values
3. Set up Storage Bucket:
In your Supabase dashboard, go to Storage
Create a new bucket named user-uploads
Configure bucket policies to allow authenticated users to upload and access
files
4. Run the Application:
cd sample_projects/supabase-demo
pnpm install
pnpm run dev
5. Test the Features:
Open your browser to the provided localhost URL
Try signing up with a new email
Check your email for verification
Sign in and explore the profile and storage features
Learning Objectives:
Understand how Supabase Auth integrates with React applications
Experience real-time authentication state management
Practice file upload and management workflows
See how Row Level Security works in practice
4.2 MongoDB Sample Project
We have created a RESTful API using Express.js that demonstrates all MongoDB CRUD
operations. This server provides a complete backend for managing user data with
proper error handling and validation.
Project Location: /sample_projects/mongodb-demo/
Features Demonstrated:
1. Create Operations:
Insert single user
Insert multiple users (bulk operation)
Data validation and duplicate checking
2. Read Operations:
Retrieve all users
Find user by ID
Query filtering and sorting
3. Update Operations:
Update single user by ID
Partial updates with validation
Timestamp tracking
4. Delete Operations:
Delete user by ID
Proper error handling for non-existent records
Setup Instructions:
1. Install MongoDB:
Option A - Local Installation: Install MongoDB Community Edition on your
system
Option B - MongoDB Atlas: Create a free cluster at mongodb.com/atlas
2. Configure the Demo:
Navigate to /sample_projects/mongodb-demo/
Copy .env.example to .env
Update the MONGODB_URI with your connection string
3. Install Dependencies:
cd sample_projects/mongodb-demo
npm install
4. Start the Server:
npm start
5. Test the API:
The server will run on http://localhost:3000
Visit the root URL to see available endpoints
Use tools like Postman, curl, or your browser to test the API
Example API Calls:
# Create a new user
curl -X POST http://localhost:3000/api/users \
-H "Content-Type: application/json" \
-d '{
"name": "Alice Johnson",
"email": "alice@example.com",
"age": 28,
"city": "New York"
}'
# Get all users
curl http://localhost:3000/api/users
# Get user by ID (replace USER_ID with actual ID)
curl http://localhost:3000/api/users/USER_ID
# Update user
curl -X PUT http://localhost:3000/api/users/USER_ID \
-H "Content-Type: application/json" \
-d '{"age": 29}'
# Delete user
curl -X DELETE http://localhost:3000/api/users/USER_ID
Learning Objectives:
Practice building RESTful APIs with Express.js
Understand MongoDB connection management
Experience error handling and validation in backend applications
Learn proper API design patterns
Practice using MongoDB drivers and operations
Both sample projects provide hands-on experience with the concepts covered in this
guide. They serve as practical references you can modify and extend as you build your
own applications.
Conclusion
This comprehensive guide has taken you through the essential aspects of backend
development using two powerful platforms: Supabase and MongoDB. By combining
theoretical knowledge with practical, hands-on tasks, you now have a solid foundation
for building robust backend systems.
Key Takeaways:
Supabase offers a modern, developer-friendly approach to backend development
with its PostgreSQL-based platform. Its integrated authentication system, cloud
storage capabilities, and real-time features make it an excellent choice for rapid
application development. The platform's emphasis on open-source technologies and
ease of use allows developers to focus on building features rather than managing
infrastructure.
MongoDB provides the flexibility and scalability needed for modern applications with
evolving data requirements. Its document-oriented model, combined with powerful
CRUD operations and various authentication mechanisms, makes it suitable for a wide
range of use cases. Understanding MongoDB's operations is crucial for any developer
working with NoSQL databases.
Practical Experience: The hands-on tasks and sample projects in this guide have
provided you with real-world experience in:
Implementing user authentication flows
Managing file uploads and storage
Designing and building RESTful APIs
Performing database operations safely and efficiently
Handling errors and edge cases in backend applications
Next Steps:
1. Experiment with the Sample Projects: Modify and extend the provided
examples to suit your specific needs
2. Build Your Own Applications: Apply the concepts learned to create your own
projects
3. Explore Advanced Features: Dive deeper into advanced topics like database
optimization, security best practices, and scalability patterns
4. Stay Updated: Both Supabase and MongoDB are actively developed platforms
with regular updates and new features
Best Practices to Remember:
Always validate user input and handle errors gracefully
Implement proper authentication and authorization
Use environment variables for sensitive configuration
Follow RESTful API design principles
Test your applications thoroughly before deployment
Monitor and log your applications in production
The backend development landscape continues to evolve, but the fundamental
concepts covered in this guide will serve as a strong foundation for your journey as a
backend developer. Whether you choose Supabase for its simplicity and integrated
features or MongoDB for its flexibility and scalability, you now have the knowledge and
practical experience to build effective backend systems.
Remember that learning backend development is an iterative process. Continue
practicing, experimenting, and building projects to reinforce your understanding and
discover new possibilities with these powerful technologies.