NodeJS(Express) + MongoDB: How To Create A Blog API
A step-by-step approach to building a blog
Table of contents
- Introduction
- Prerequisites
- Setup Environment And Installations
- Overview
- Dependencies
- Project Structure
- Setting Up The Project
- Creating A Basic Express Server
- Understanding package.json, package-lock.json and node_modules
- Handling Sensitive Data
- Schema And Database Integration (with mongoDB)
- Users' Sign-up
- Registering The User
- Users' Login
- User Access
- Posts' Routes
- Posts' Function
- User Verification/Authentication
- Performing CRUD
- Data Validation
- Deployment
- Conclusion
- References
Introduction
A blog is a website or page that is a part of a larger website. Typically, it features articles written in a conversational style with accompanying pictures or videos.
Various types of blogs include tech, lifestyle, fashion, beauty, health and fitness, education, travel, photography, etc.
In this article, we are going to discuss how to build a fully functional blog API, covering:
Creating an express server
Database implementation with MongoDB
Users' routes
Posts' routes
Fetching data with queries
Security
Prerequisites
This blog will be built with Node.js (Express) and MongoDB
Basic JavaScript and ES6 knowledge
Understanding how blogs work
Basic understanding of Node.js and MongoDB database
A good understanding of VSCode
Setup Environment And Installations
Setting up Node.js
To install Node.js, click on this link, select LTS (Long Term Support) and download the installer for your operating system.
Testing endpoints
Postman or Thunder Client can be used for testing the routes/endpoints. In this article, we shall be using Thunder Client, which will be installed later in the article.
How to use Thunder Client.
MongoDBCompass
MongoDB is a NoSQL document database, designed for ease of development and scaling. It can be deployed locally or in the cloud. In this article, we shall be using a local deployment, managed through the GUI tool MongoDB Compass.
Click here for a step-by-step guide on how to download MongoDB Compass on your system.
Overview
Users have the privilege to sign up for new accounts, log in with a username and password after being verified, and create, read, update and delete posts.
Here are the API endpoints needed:
| Methods | Endpoints | Actions |
| --- | --- | --- |
| POST | /api/auth/register | signup new users |
| POST | /api/auth/login | login users |
| GET | /api/users/ | view users' profiles |
| PUT | /api/users/:id | update a user's profile |
| DELETE | /api/users/:id | delete a user account and content (optional) |
| POST | /api/posts | create a post |
| GET | /api/posts/:id | view a user's post |
| GET | /api/publishedPosts | view all published posts |
| GET | /api/publishedPosts?title=title_of_post | search published posts by title |
| GET | /api/publishedPosts?author=author_of_post | search published posts by author |
| GET | /api/publishedPosts?tag=post's_tag | search published posts by tag |
| PUT | /api/posts/:id | update a post |
| DELETE | /api/posts/:id | delete a post |
Dependencies
nodemon - nodemon is a tool that helps develop node-based applications by automatically restarting the node application when file changes in the directory are detected.
Express - Express is a free and open-source Node.js web application framework. Built on top of the Node.js built-in HTTP module, Express helps us set up routing and handle the request/response cycle.
bcrypt - bcrypt turns a simple password into fixed-length characters called a hash. Before hashing a password, bcrypt applies a salt, a unique random string that makes the hash unpredictable.
body-parser - body-parser is the Node.js body parsing middleware. It is responsible for parsing the incoming request bodies in a middleware before you handle it.
crypto-js - crypto-js is a JavaScript library of cryptographic algorithms that performs data encryption and decryption. It is used for security purposes, such as storing passwords in the database in encrypted form.
dotenv - The dotenv package is a great way to keep passwords, API keys, and other sensitive data out of your code. It allows you to create environment variables in a `.env` file instead of putting them in your code.
express-rate-limit - express-rate-limit implements rate limiting, a strategy you can use to control traffic on a network, secure a backend API, and limit the number of unwanted requests a user can make within a specific time frame.
helmet - helmet fills in the gap between Node.js and Express.js by securing HTTP headers that are returned by your Express apps.
joi - joi is used for schema description and data validation.
joi-password - joi-password is an extension of joi that helps in validating passwords.
jsonwebtoken - Often called JWT, or JSON Web Token: an open standard used to share information securely between two parties, a client and a server. JWTs are mainly used for authentication. After a user signs in to an application, the application assigns a JWT to that user. Subsequent requests by the user include the assigned JWT; this token tells the server what routes, services, and resources the user is allowed to access.
mongoose - mongoose is an Object Data Modeling (ODM) library for MongoDB, distributed as an npm package. It allows developers to enforce a specific schema at the application layer.
morgan - morgan is a Node.js and Express middleware to log (take records of events that occur in the software) HTTP requests and errors, simplifying the process.
morgan-json - morgan-json provides format functions that output JSON (JavaScript Object Notation).
winston - winston is designed to be a simple and universal logging library with support for multiple transports.
Project Structure
Setting Up The Project
On the desktop, create a folder and call it `MyBlogAPI`. Open this folder in your VS Code. To confirm that node was properly installed, type `node --version` or `node -v` in the terminal. This shows you the version of node installed.

Then type `npm init -y`. This will create a `package.json` file without going through an interactive process. The `-y` stands for "yes" to all prompts.

As an alternative to `npm`, other package managers such as `yarn` can be used. Check here to find which of them suits you.
terminal
npm init -y
package.json file
{
"name": "myblogapi",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
Creating A Basic Express Server
Here, we need to install some dependencies, `express`, `dotenv` and `nodemon`, which we shall use in this section to create a server.
In the terminal, type the code below:
terminal
npm install express dotenv mongoose
or
npm i express dotenv mongoose
and
npm install --save-dev nodemon
Notice how nodemon was installed in a slightly different way. This is because it's a development dependency, meaning it's only used during development, unlike express, mongoose and dotenv, which are used in both development and production.

You'll notice that the word `install` can be replaced with the letter `i`, and numerous dependencies can be installed all at once by listing them together, just as we did. These are just shortcuts used during installation.
package.json file
{
"name": "myblogapi",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"dotenv": "^16.0.3",
"express": "^4.18.2",
"mongoose": "^6.8.2"
},
"devDependencies": {
"nodemon": "^2.0.20"
}
}
After successfully installing the dependencies (sometimes also referred to as libraries), we create an `app.js` file in the `MyBlogAPI` folder. The `app.js` should contain the code below:
app.js file
const express = require('express')
const app = express()
const PORT = 6060
app.listen(PORT, () => {
console.log(`Server is running at port http://localhost:${PORT}`)
})
In the code snippet above, the word `require` is used to import the installed libraries. We used it to import the `express` library and then create an instance of the library called `app`. `PORT` is set to 6060; you can, however, set your port to any value greater than 1024 (and below 65536), as long as it's not being used by other programs on your system.

To run `app.js`, go to the terminal and run `nodemon app.js`, or just `nodemon app`. This is going to start the server.
terminal
nodemon app
Any time there's a change made in the code and saved, the server restarts automatically. This is nodemon in action!
Understanding package.json, package-lock.json and node_modules
You will notice two (2) files and one (1) folder that were created automatically in the `MyBlogAPI` folder: `package.json`, `package-lock.json` and `node_modules` respectively. This is normal; it happens when we install our first dependency.

The `package.json` file records important metadata about a project. It also defines functional attributes of the project that npm uses to install dependencies, run scripts, and identify the entry point to our package.

The `package-lock.json` is essentially used to lock dependencies to specific version numbers. This file is automatically generated (or re-generated) when there is a change in either the node_modules tree or `package.json`.

The `node_modules` directory stores the modules that can be loaded by the Node.js `require()` function.
Handling Sensitive Data
Proper data handling is important for maintaining the integrity of an application's data, since it addresses concerns related to confidentiality, security and retention. Planning for data handling also results in efficient and economical storage, retrieval and disposal of data.

As a result, there's a need to protect sensitive values like the `PORT` used in the server.

To achieve this, create a `.env` file in the `MyBlogAPI` folder and type the code below:
.env file
PORT=6060
The `.env` file helps to hide or keep data secret.

Next, inside `app.js`, replace `const PORT = 6060` with the code below:
app.js file
const PORT = process.env.PORT
To make `.env` effective, we require and configure the package by calling `require('dotenv').config()`. So the `app.js` file becomes:
app.js file
const express = require('express')
const app = express()
require('dotenv').config()
const PORT = process.env.PORT
app.listen(PORT, () => {
console.log(`Server is running at port http://localhost:${PORT}`)
})
Make sure to save every file created. nodemon will automatically restart the server.
Here's the updated project:
You should get this response in your terminal as well -
Server is running at port
http://localhost:6060
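One caveat: if the `PORT` variable is ever missing from `.env`, `process.env.PORT` comes back `undefined` and the server won't listen where you expect. A common defensive pattern (a suggestion of mine, not part of the article's code) is a fallback default:

```javascript
// Prefer the environment value if present, otherwise fall back to a default.
// The helper name getPort and the 6060 default are illustrative choices.
function getPort(env, fallback = 6060) {
  return env.PORT !== undefined ? Number(env.PORT) : fallback;
}

console.log(getPort({}));               // 6060 (no PORT set)
console.log(getPort({ PORT: "8080" })); // 8080
```

In `app.js` this would read `const PORT = process.env.PORT || 6060`.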
Next, we are going to hide the `node_modules` folder and the `.env` file using a file called `.gitignore`.

As more dependencies are installed, the `node_modules` folder grows ever larger on disk. It shouldn't be committed to GitHub, however.

`.gitignore` tells Git which files to ignore when committing a project to a GitHub repository.

Create a `.gitignore` file in the root folder, i.e. directly inside the `MyBlogAPI` folder, and type into it the code below:
.gitignore file
/node_modules
.env
Schema And Database Integration (with mongoDB)
In this section, we shall integrate MongoDB with the API project, using the dependency/package/library we installed called `mongoose`. This helps to create a basic structure for each document, referred to as a `schema`. Each `schema` maps to a MongoDB collection and defines the shape of the documents within that collection.

Not to worry if you are getting overwhelmed at this point; I will walk you through the entire process, step by step.

Create a folder called `model` in the root directory. This `model` folder should contain a file called `User.js`, or better still, let's call it `User.model.js` for more descriptive naming, indicating that the file belongs to the `model` folder.

We will define the `UserSchema` object in this file.

In the `User.model.js` file, we import the `mongoose` dependency and create the `schema` structure. The `UserSchema` contains the necessary user data fields: `firstname`, `lastname`, `username`, `email` and `password`.
User.model.js file
const mongoose = require('mongoose')
const Schema = mongoose.Schema
const UserSchema = new Schema({
firstname: { type: String, required: true },
lastname: { type: String, required: true },
username: { type: String, required: true, unique: true },
email: { type: String, required: true, unique: true },
password: { type: String, required: true }
},
{timestamps: true});
module.exports = mongoose.model("User", UserSchema)
Now, let's integrate the entire process to link the schema to MongoDB.
Just a few more tweaks!
Create a folder called `config` in the root directory. This `config` folder should contain a file called `db.js`, which simply stands for the database. Here, we set up the database connection and export it as a function.

Type the code below inside `db.js`.
db.js file
const mongoose = require("mongoose");
require("dotenv").config()
const MONGO_URL = process.env.MONGO_URL
//checking the connection
function connectToMongoDB(){
mongoose.connect(MONGO_URL)
mongoose.connection.on("connected", () => {
console.log("Connected to MongoDB Successfully!");
});
mongoose.connection.on("error", (err) => {
console.log("An error occurred while connecting to MongoDB");
console.log(err);
})
}
module.exports = {
connectToMongoDB
}
In `app.js`, we import the `db.js`, destructure `connectToMongoDB` and call it as a function. This connects the app to the database.

Type the code below inside `app.js`.
app.js file
const express = require('express')
const app = express()
require('dotenv').config()
const PORT = process.env.PORT
const { connectToMongoDB } = require('./config/db')
// connection to database
connectToMongoDB()
app.listen(PORT, () => {
console.log(`Server is running at port http://localhost:${PORT}`)
})
Launch the MongoDB Compass we installed earlier and copy the URI provided: `mongodb://localhost:27017`.
MongoDBCompass
Next, go to the `.env` file and insert the URI as `MONGO_URL=mongodb://localhost:27017/MyBlogAPI`, where `MyBlogAPI` is the name of the database (you can give it any other name of your choice). This creates the `MyBlogAPI` database and its collections, visible in MongoDB Compass.
.env file
PORT=6060
MONGO_URL=mongodb://localhost:27017/MyBlogAPI
Make sure all files are saved and nodemon is still running the server!
You should now be connected to the database, as confirmed in the terminal: `Connected to MongoDB Successfully!`. Ignore the deprecation warning shown in the terminal for now.
Users' Sign-up
Installing/Importing Necessary Files and Libraries
In this section, we shall be creating users' registration and login routes. A user will be able to log in after being authenticated and verified.
For this, there's a need to use some authentication libraries. In this article, we shall be using `jsonwebtoken` and `crypto-js`. Let's install them.
In the terminal, type and run the code below:
terminal
npm i jsonwebtoken crypto-js
Then we create a folder in the root directory and name it `controllers`. In this `controllers` folder, create a file called `auth.controller.js`. By now, you should be familiar with this naming convention.

Let's import some files and libraries in the `auth.controller.js` file. We shall be using them to create the `signup` and `login` functions.

auth.controller.js file
const User = require("../model/User.model");
const CryptoJS = require("crypto-js");
const PASSWORD_SECRET_KEY = process.env.PASSWORD_SECRET_KEY
const jwt = require('jsonwebtoken');
const JWT_SECRET_TOKEN = process.env.JWT_SECRET_TOKEN
In the code above, we import the necessary file and libraries and assign them to their respective variable names, using `const`.

Here's an explanation of what each declared variable does:

- `User` is used to create a User object from the `User.model.js`. It gives the user the privilege to fill in their details during registration and login.
- `CryptoJS` is used to encrypt the password given by the user.
- `PASSWORD_SECRET_KEY` is the secret key behind the encryption performed by the `CryptoJS` library.
- `JWT_SECRET_TOKEN` is the secret key used to sign the `jwt` (token).

Before proceeding to write the `signup` and `login` functions in the `auth.controller.js`, there's a need to secure the `PASSWORD_SECRET_KEY` and the `JWT_SECRET_TOKEN` by adding them to the `.env` file, assigning them random values which will serve as secret keys.

.env file
PORT=6060
MONGO_URL=mongodb://localhost:27017/MyBlogAPI
PASSWORD_SECRET_KEY=abcd1234
JWT_SECRET_TOKEN=1234abcd
The Sign-up Function
In the `auth.controller.js`, let's implement the `signup` function that will give aspiring users the privilege to register.
auth.controller.js file
const User = require("../model/User.model");
const CryptoJS = require("crypto-js");
const PASSWORD_SECRET_KEY = process.env.PASSWORD_SECRET_KEY
const jwt = require('jsonwebtoken');
const JWT_SECRET_TOKEN = process.env.JWT_SECRET_TOKEN
// sign-up function
const signup = async (req, res)=> {
try {
const encryptedPassword = CryptoJS.AES.encrypt(req.body.password, PASSWORD_SECRET_KEY).toString()
const newUser = new User({
firstname: req.body.firstname,
lastname: req.body.lastname,
username: req.body.username,
email: req.body.email,
password: encryptedPassword,
});
const user = await newUser.save();
res.status(200).json({
"message": "Registered Successfully!",
user
})
} catch (err) {
res.status(500).json(err);
}
}
module.exports = { signup }
The `signup` function is an async function that handles operations asynchronously. This permits the await keyword within the function.

The `req` and `res` parameters passed to the `signup` function represent the request from the user and the response sent back to the user respectively. `req` is an object containing information about the HTTP request that raised the event; in response to `req`, `res` sends back the desired HTTP response.
The `firstname`, `lastname`, `username`, `email` and `password` are the values passed in by the user. They are read using the `req.body` property, which contains key-value pairs of the data submitted in the request body.
An HTTP 200 OK
success status response code indicates that the request is successful, while an HTTP 500 Internal Server Error
server error response code indicates that the server encountered an unexpected condition that prevented it from fulfilling the request.
Next, let's create a route for the `signup` using `express.Router()` with the `POST` method. A route is a URL path, or a section of Express code, that associates an HTTP verb with a handler. An HTTP verb defines a request method indicating the desired action to be performed on a given resource (GET, POST, PUT, DELETE, etc.). The `express.Router()` function creates a new router object for handling requests.
Let's create a folder in the root directory and name it `routes`. In this `routes` folder, create a file called `auth.route.js`. In the `auth.route.js`, we wire up the `signup` function, created earlier in the `auth.controller.js`, with the `POST` method to create a route called `register`. The `express.Router()` instance, assigned to a variable called `AuthRouter`, is then exported for use in other files via `module.exports`.
auth.route.js file
const express = require('express');
const AuthRouter = express.Router();
const AuthController = require("../controllers/auth.controller")
// Register
AuthRouter.post("/register", AuthController.signup);
module.exports = AuthRouter
In the `app.js`, we import the `AuthRouter` and use it as middleware.
app.js file
const authRoute = require("./routes/auth.route")
app.use("/api/auth", authRoute)
`const authRoute = require("./routes/auth.route")` imports the `auth.route.js`, and `app.use("/api/auth", authRoute)` mounts `authRoute` at the `/api/auth` path.
app.js file updated!
const express = require('express')
const app = express()
require('dotenv').config()
const PORT = process.env.PORT
const { connectToMongoDB } = require('./config/db')
const authRoute = require("./routes/auth.route")
// connection to database
connectToMongoDB()
// middleware
app.use(express.json())
app.use("/api/auth", authRoute)
app.listen(PORT, () => {
console.log(`Server is running at port http://localhost:${PORT}`)
})
Launch or refresh the MongoDB Compass software and click on `connect`.
MongoDBCompass
On clicking `connect`, we get directed to this page.

Now we see the database `MyBlogAPI` present in the database column on the left.

Click on the dropdown button; you should see the `users` collection, which was defined earlier in the `User.model.js` file. For now, there's no data in the `users` collection.
Registering The User
Go to `app.js` and put in this code: `app.use(express.json())`.

`express.json()` is a built-in middleware function in Express. It parses incoming requests with JSON payloads and is based on body-parser.
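Conceptually, JSON body parsing boils down to buffering the request stream and running `JSON.parse` over it; a simplified sketch of the idea (not Express's actual implementation):

```javascript
// Simulate a request body arriving in chunks, as a TCP stream would deliver it.
const chunks = [Buffer.from('{"username":"user'), Buffer.from('John"}')];

// What express.json() effectively does: concatenate the chunks, decode them,
// parse the JSON, then expose the result as req.body for your handlers.
const body = JSON.parse(Buffer.concat(chunks).toString("utf8"));

console.log(body.username); // "userJohn"
```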
Remember the `/api/auth` path in `app.js` and the `register` route in `auth.route.js`? Together they form the users' signup endpoint.
app.js file updated!
const express = require('express')
const app = express()
require('dotenv').config()
const PORT = process.env.PORT
const { connectToMongoDB } = require('./config/db')
const authRoute = require("./routes/auth.route")
// connection to database
connectToMongoDB()
// middleware
app.use(express.json())
app.use("/api/auth", authRoute)
app.listen(PORT, () => {
console.log(`Server is running at port http://localhost:${PORT}`)
})
On the Thunder Client, we insert the endpoint `http://localhost:6060/api/auth/register` and supply the user's registration data in the `Body`, as defined in the schema.
User's Sign-up Data Sample
{
"firstname": "John",
"lastname": "Doe",
"username": "userJohn",
"email": "userJohn@mail.com",
"password": "Password0!"
}
User Sign-Up (with Thunder Client)
This registers the user, as can be seen on the right side of the VS Code. Also, on refreshing MongoDB Compass, we see that the user data has persisted.
MongoDBCompass
Users' Login
In this section, we shall discuss how to implement a user's `login`. Similarly to the `signup` implementation, we create a `login` function in the `auth.controller.js`.

The necessary libraries were already installed during the `signup` implementation.
The Login Function
In the `auth.controller.js`, let's implement the `login` function that will give users the privilege to log in to their accounts.
const login = async (req, res) => {
try {
const user = await User.findOne({ username: req.body.username });
if (!user) return res.status(400).json("Wrong username or password!");
const decryptedPassword = CryptoJS.AES.decrypt(user.password, PASSWORD_SECRET_KEY);
const OriginalPassword = decryptedPassword.toString(CryptoJS.enc.Utf8);
if (OriginalPassword !== req.body.password) return res.status(401).json("Wrong login details!");
const accessToken = jwt.sign({
id: user._id
}, JWT_SECRET_TOKEN, {expiresIn: "1h"}
);
// Destructuring the user to send other details except password
const { password, ...other } = user._doc;
res.status(200).json({
"message": "Login Successful!",
...other,
accessToken
});
} catch (err) {
res.status(500).json(err);
}
}
module.exports = { signup, login }
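Inside `login`, the line `const { password, ...other } = user._doc` relies on object rest destructuring to copy every field except `password` into `other`. In isolation, with a made-up document standing in for `user._doc`:

```javascript
// A stand-in for the Mongoose document's plain-object form (user._doc);
// the field values are hypothetical sample data.
const doc = {
  username: "userJohn",
  email: "userJohn@mail.com",
  password: "U2FsdGVkX1...",
};

// Pull password out; collect every remaining field into `other`.
const { password, ...other } = doc;

console.log(other); // { username: "userJohn", email: "userJohn@mail.com" }
```

This is why the login response contains the user's details but never the stored password.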
Next, in the `auth.route.js`, we wire up the `login` function from the `auth.controller.js` with the `POST` method to create a route called `login`.
We insert this code:
AuthRouter.post("/login", AuthController.login);
auth.route.js file
const express = require('express');
const AuthRouter = express.Router();
const AuthController = require("../controllers/auth.controller")
// Register
AuthRouter.post("/register", AuthController.signup);
// Login
AuthRouter.post("/login", AuthController.login);
module.exports = AuthRouter
auth.controller.js file updated!
const User = require("../model/User.model");
const CryptoJS = require("crypto-js")
const PASSWORD_SECRET_KEY = process.env.PASSWORD_SECRET_KEY
const jwt = require('jsonwebtoken');
const JWT_SECRET_TOKEN = process.env.JWT_SECRET_TOKEN
const signup = async (req, res)=> {
try {
// const salt = await bcrypt.genSalt(10);
// const hashedPassword = await bcrypt.hash(req.body.password, salt)
const encryptedPassword = CryptoJS.AES.encrypt(req.body.password, PASSWORD_SECRET_KEY).toString()
const newUser = new User({
firstname: req.body.firstname,
lastname: req.body.lastname,
username: req.body.username,
email: req.body.email,
password: encryptedPassword,
});
const user = await newUser.save();
res.status(200).json({
"message": "Registered Successfully!",
user
})
} catch (err) {
res.status(500).json(err);
}
}
const login = async (req, res) => {
try {
const user = await User.findOne({ username: req.body.username });
if (!user) return res.status(400).json("Wrong username or password!");
const decryptedPassword = CryptoJS.AES.decrypt(user.password, PASSWORD_SECRET_KEY);
const OriginalPassword = decryptedPassword.toString(CryptoJS.enc.Utf8);
if (OriginalPassword !== req.body.password) return res.status(401).json("Wrong login details!");
const accessToken = jwt.sign({
id: user._id
}, JWT_SECRET_TOKEN, {expiresIn: "1h"}
);
// Destructuring the user to send other details except password
const { password, ...other } = user._doc;
res.status(200).json({
"message": "Login Successful!",
...other,
accessToken
});
} catch (err) {
res.status(500).json(err);
}
}
module.exports = {
login,
signup
}
The `findOne()` function is a mongoose model method used to find one document matching a condition. If multiple documents match the condition, it returns the first document satisfying it.

Here, the user supplies the `username` and corresponding `password` used during registration. If the user is found, the stored password is decrypted and compared with the supplied `password` to find a match. If there's a match, a token is generated and signed by `jwt` with a duration of one (1) hour. At token expiration, the user is forcefully logged out of the application and will be required to `login` again to have access.
Here, the `/api/auth` path in `app.js` and the `login` route in `auth.route.js` form the users' login endpoint.

On the Thunder Client, we insert the endpoint `http://localhost:6060/api/auth/login` and supply the user's login data in the `Body`, as defined in the schema, using the `POST` method.
User Access
User's Login Data Sample
{
"username": "userJohn",
"password": "Password0!"
}
User Login (with Thunder Client)
Notice the `accessToken` generated. That's the `JWT_SECRET_TOKEN` in action.
That's it for the Users' Sign-up and Login.
Before we move to the next section, let's do a quick update.
In `package.json`, replace `index.js` in the `main` value with `app.js`. Also, add `"dev": "nodemon app.js"` to the `scripts`.

Now, in the terminal, we can run `npm run dev` instead of `nodemon app.js` to run the server.
package.json file
{
"name": "myblogapi",
"version": "1.0.0",
"description": "",
"main": "app.js",
"scripts": {
"dev": "nodemon app.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"body-parser": "^1.20.1",
"crypto-js": "^4.1.1",
"dotenv": "^16.0.3",
"express": "^4.18.2",
"jsonwebtoken": "^9.0.0",
"mongoose": "^6.8.2"
},
"devDependencies": {
"nodemon": "^2.0.20"
}
}
Posts' Routes
Having implemented the User signup
and login
, we can then implement registered users' posts. Here, users have the privilege to post content on their blogs.
Similarly to the Users' signup
route, we define a PostSchema
object in a file located in the model
folder. Let's call the file Post.model.js
.
The `PostSchema` contains the necessary post data fields: `title`, `description`, `tags`, `state`, `read_count`, `reading_time`, `body` and `author`.

`state` defines whether a post is in the `draft` or `published` state. By default, the state is `draft`.

`read_count` indicates how many times the post was read.

`reading_time` tells how long it takes to read the post.
Post.model.js file
const mongoose = require('mongoose')
const Schema = mongoose.Schema
const PostSchema = new Schema({
title: { type: String, required: true, unique: true },
description: { type: String },
tags: { type: Array },
state: { type: String, enum: ['draft', 'published'], default: 'draft'},
read_count: { type: Number, default: 0 },
reading_time: { type: Number },
body: { type: String, required: true },
author: {
type: Schema.Types.ObjectId,
ref: "User"
}
}, {timestamps: true});
// compiling the model
module.exports = mongoose.model("Post", PostSchema)
Posts' Function
Create a file in the `controllers` folder and call it `post.controller.js`. Let's implement the post functions that will give users the privilege to create, read, search, update and delete a post. We refer to these operations as `CRUD`. Note that only authorized users have certain privileges, like updating and deleting a post. Along with this, we implement the `reading_time` algorithm.

Create a folder in the root directory, call it `utils`, then create a file inside it called `readTime.js`.
readTime.js file
// An algorithm that calculates an estimated time to read a blog
const readTime = (post) => {
const numWords = post.split(" ").length
const wpm = numWords/200
return Math.round(wpm) === 0 ? 1 : Math.round(wpm)
}
module.exports = readTime;
This algorithm counts the number of words in the `body` and divides it by 200 (an average reading speed in words per minute); the result is rounded to the nearest whole number, with a minimum of one minute. It gets exported and then imported in the `post.controller.js`.
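With the same function (repeated here so the sketch is self-contained), a couple of sample inputs show the behaviour, including the one-minute floor:

```javascript
// Same algorithm as readTime.js: words / 200 wpm, rounded, minimum 1 minute.
const readTime = (post) => {
  const numWords = post.split(" ").length;
  const wpm = numWords / 200;
  return Math.round(wpm) === 0 ? 1 : Math.round(wpm);
};

const shortPost = "word ".repeat(50).trim();  // 50 words
const longPost = "word ".repeat(450).trim();  // 450 words

console.log(readTime(shortPost)); // 1  (0.25 rounds to 0, floored to 1)
console.log(readTime(longPost));  // 2  (2.25 rounds to 2)
```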
.
post.controller.js file
const readTime = require("../utils/readTime")
const Post = require("../model/Post.model");
// Create Post
const createPost = async (req, res) => {
const newPost = new Post(req.body);
try {
newPost.reading_time = readTime(newPost.body)
await newPost.save();
res.status(200).json({
message: "Blog created Successfully!",
newPost
});
} catch (err) {
res.status(500).json(err);
}
}
// Get Post by ID
const getPost = async (req, res) => {
const post = await Post.findById(req.params.id).populate("author");
// console.log(post)
if(!post){
return res.status(404).send('Post not found!')
}
post.reading_time = readTime(post.body)
post.read_count += 1;
await post.save();
res.status(200).json(post);
}
// Get All Posts or Search by Title, author or tag
const getAllPosts = async (req, res) => {
const title = req.query.title;
const author = req.query.author;
const tag = req.query.tag;
const state = req.query.state;
const { page=1, limit=20 } = req.query;
try{
let posts;
if(title){
posts = await Post.find({ title });
} else if(author){
posts = await Post.find({ author }).sort({ _id: -1 }).limit(limit*1).skip((page-1)*limit).exec();
} else if(state){
posts = await Post.find({ state }).sort({ _id: -1 }).limit(limit*1).skip((page-1)*limit).exec();
} else if(tag){
posts = await Post.find({
tags: {
$in: [tag],
},
}).sort({ _id: -1 }).limit(limit*1).skip((page-1)*limit).exec();
} else {
posts = await Post.find().sort({ _id: -1 }).limit(limit*1).skip((page-1)*limit).exec();
}
const count = await Post.countDocuments();
res.status(200).json({
message: "Successful!",
posts,
totalPages: Math.ceil(count / limit),
currentPage: page
});
} catch(err){
res.status(500).json(err);
}
}
// Update A Post
const updatePost = async (req, res)=> {
try{
const updatedPost = await Post.findByIdAndUpdate(req.params.id,{
$set: req.body
},
{new: true}
);
res.status(200).json({
message: "Post has been updated!",
updatedPost
});
} catch (err) {
res.status(500).json(err);
}
}
// Delete A Post
const deletePost = async (req, res)=> {
try{
await Post.findByIdAndDelete(req.params.id);
res.status(200).json("Post has been deleted!");
} catch (err) {
res.status(500).json(err);
}
};
module.exports = {
createPost,
getPost,
getAllPosts,
updatePost,
deletePost
}
The `CRUD` functions are all async functions that handle operations asynchronously. This permits the await keyword within each function.

The `createPost` function takes a post from the user's request `req` and saves it in the database.
The `getPost` function retrieves a specific post from the database using the `id` generated for the post. The `populate` function is a mongoose feature used for populating the data behind a reference; meaning, when a post is requested, both the post data and the author's details get returned.
The getAllPost
gives the user the privilege to find, search or filter posts by author
, or by title
, or by tag
. The result is paginated and limited to 20 blogs per page. Posts can also be queried by state
.
The updatePost
function uses the id
to find its corresponding post in the database, if found, it updates the post based on the user's discretion.
The deletePost
function also uses the id
to find its corresponding post in the database, if found, it simply removes the posts from the database.
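The pagination used throughout these handlers comes down to two formulas: skip the first (page - 1) * limit documents, and report totalPages as the ceiling of count / limit. A quick sketch in plain JavaScript, independent of Mongoose:

```javascript
// Pagination math used by the handlers: which documents a page covers,
// and how many pages a collection of `count` documents produces.
function paginate(count, page = 1, limit = 20) {
  return {
    skip: (page - 1) * limit,           // documents to skip before this page
    totalPages: Math.ceil(count / limit),
    currentPage: page,
  };
}

console.log(paginate(45, 2, 20)); // → { skip: 20, totalPages: 3, currentPage: 2 }
```

With 45 posts and a limit of 20, page 2 skips the first 20 documents and the collection spans 3 pages in total.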
We also need to create a file for the published posts, located in the controllers folder. It contains an implementation similar to post.controller.js.
publishedPosts.controller.js file
const readTime = require("../utils/readTime");
const Post = require("../model/Post.model");

// Get All Published Posts or Search by Title, author or tag
const publishedPosts = async (req, res) => {
  const { title, author, tag, page = 1, limit = 20 } = req.query;
  let sort = req.query.sort;
  try {
    // Fall back to read_count when an unknown sort field is supplied
    const validSort = ["read_count", "reading_time", "timestamp"].includes(sort);
    if (!validSort) sort = "read_count";

    // Only published posts are visible on this route
    const filter = { state: "published" };
    if (title) filter.title = title;
    if (author) filter.author = author;
    if (tag) filter.tags = { $in: [tag] };

    const posts = await Post.find(filter)
      .sort({ [sort]: -1 })
      .limit(limit * 1)
      .skip((page - 1) * limit)
      .exec();

    const count = await Post.countDocuments(filter);
    res.status(200).json({
      posts,
      totalPages: Math.ceil(count / limit),
      currentPage: page
    });
  } catch (err) {
    res.status(500).json(err);
  }
};

module.exports = { publishedPosts };
Next, we create two (2) files in the routes folder - posts.route.js and publishedPosts.route.js.
posts.route.js file
const express = require('express');
const postRouter = express.Router();
const PostController = require("../controllers/post.controller");
// Create Post
postRouter.post("/", PostController.createPost);
// Get Post by ID
postRouter.get("/:id", PostController.getPost);
// Get All Posts or Search by Title, author or tag
postRouter.get("/", PostController.getAllPosts);
// Update A Post
postRouter.put("/:id", PostController.updatePost);
// Delete A Post
postRouter.delete("/:id", PostController.deletePost);
module.exports = postRouter
Each function created earlier in post.controller.js is wired into the postRouter: createPost with the POST method, getPost and getAllPosts with the GET method, updatePost with the PUT method, and deletePost with the DELETE method. Finally, we export the postRouter.
publishedPosts.route.js file
const express = require('express')
const publishedPostRouter = express.Router();
const PublishedPostController = require("../controllers/publishedPosts.controller")
// Get All Published Posts or Search by Title, author or tag
publishedPostRouter.get("/", PublishedPostController.publishedPosts);
module.exports = publishedPostRouter
The publishedPosts function is integrated into the publishedPostRouter using the GET method, and the publishedPostRouter is then exported.
In the routes folder, create a file called auth.route.js. In auth.route.js, we integrate the signup function, which was created earlier in auth.controller.js, with the POST method to create a route called register. The express.Router() instance, set to a variable called AuthRouter, is then exported for use in other modules using module.exports.
In the app.js file, we import both postRouter and publishedPostRouter, then use them as middleware.
app.js file
const express = require('express')
const app = express()
require('dotenv').config()
const PORT = process.env.PORT
const { connectToMongoDB } = require('./config/db')
const authRoute = require("./routes/auth.route")
const postRoute = require("./routes/posts.route")
const publishedPostRoute = require("./routes/publishedPosts.route")
// connection to database
connectToMongoDB()
// middleware
app.use(express.json())
app.use("/api/auth", authRoute)
app.use("/api/posts", postRoute)
app.use("/api/publishedPosts", publishedPostRoute)
app.listen(PORT, () => {
  console.log(`Server is running at http://localhost:${PORT}`)
})
User Verification/Authentication
This is the process of confirming a user's identity by obtaining credentials and using those credentials to validate their identity.
In this article, we are going to use a JWT token to verify that logged-in users have CRUD privileges.
In the root folder, we create a folder called middleware. In this folder, create a file called verifyBearerToken.js. This file contains a verifyToken function which grabs the token provided in Thunder Client's HTTP Headers. The function is exported, then imported in posts.route.js and used as middleware.
verifyBearerToken file
const jwt = require('jsonwebtoken');
const JWT_SECRET_TOKEN = process.env.JWT_SECRET_TOKEN;

const verifyToken = (req, res, next) => {
  const authHeader = req.headers.token;
  if (authHeader) {
    const token = authHeader.split(" ")[1];
    jwt.verify(token, JWT_SECRET_TOKEN, (err, user) => {
      if (user) {
        req.user = user;
        next();
      } else if (err) {
        res.status(403).json("Invalid Token!");
      }
    });
  } else {
    return res.status(401).json("You are not authenticated!");
  }
};

module.exports = {
  verifyToken,
};
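The middleware expects the token header to carry a value of the form "Bearer &lt;token&gt;"; the split(" ")[1] call is what strips the Bearer prefix. A minimal sketch of that extraction, with a made-up placeholder token and an explicit guard for malformed headers:

```javascript
// Extract the raw JWT from a "Bearer <token>" header value.
// Returns null when the header is missing or not prefixed with "Bearer".
function extractBearerToken(headerValue) {
  if (!headerValue) return null;
  const [scheme, token] = headerValue.split(" ");
  return scheme === "Bearer" && token ? token : null;
}

console.log(extractBearerToken("Bearer abc.def.ghi")); // "abc.def.ghi"
console.log(extractBearerToken("abc.def.ghi"));        // null (no Bearer prefix)
```

A missing or malformed header is exactly the case the middleware answers with 401 ("You are not authenticated!").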
posts.route.js file
const express = require('express');
const postRouter = express.Router();
const PostController = require("../controllers/post.controller");
const { verifyToken } = require('../middleware/verifyBearerToken');
// Create Post
postRouter.post("/", verifyToken, PostController.createPost);
// Get Post by ID
postRouter.get("/:id", verifyToken, PostController.getPost);
// Get All Posts or Search by Title, author or tag
postRouter.get("/", verifyToken, PostController.getAllPosts);
// Update A Post
postRouter.put("/:id", verifyToken, PostController.updatePost);
// Delete A Post
postRouter.delete("/:id", verifyToken, PostController.deletePost);
module.exports = postRouter
Performing CRUD
Let's log in with the data provided earlier, using Thunder Client.
Login endpoint - http://localhost:6060/api/auth/login
{
"username": "userJohn",
"password": "Password0!"
}
Before performing any CRUD operation, copy the accessToken generated during login and paste it into the Headers token field, prefixed with the word Bearer and separated by a space. Note that the accessToken expires one (1) hour after the user logs in, which means the user is automatically logged out. Users are then required to log in again to regain access.
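Assuming the accessToken is a standard JWT issued with an exp claim (seconds since the epoch - jsonwebtoken sets this when expiresIn is used), a client can read the expiry without verifying the signature. A sketch, with the token hand-built for illustration:

```javascript
// Decode a JWT payload (middle segment, base64url) and check its exp claim.
// No signature verification - this only tells the client when to re-login.
function isTokenExpired(token, nowSeconds = Math.floor(Date.now() / 1000)) {
  const payload = JSON.parse(
    Buffer.from(token.split(".")[1], "base64url").toString("utf8")
  );
  return payload.exp <= nowSeconds;
}

// Throwaway token whose payload expires at t = 1000; the header and
// signature segments are dummies since only the payload is decoded here.
const payload = Buffer.from(JSON.stringify({ exp: 1000 })).toString("base64url");
const fakeToken = `x.${payload}.y`;

console.log(isTokenExpired(fakeToken, 999));  // false - still valid
console.log(isTokenExpired(fakeToken, 1001)); // true  - expired
```

The server still enforces expiry via jwt.verify; this check only saves the client a guaranteed 403 round trip.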
Create a post
Method - POST
Route- http://localhost:6060/api/posts
In Thunder Client, under the Body tab, select JSON and paste the data below:
{
"title": "qui est esse",
"author": "6365edb2e4364c1971dc72b3",
"tags": ["Anker2", "Soundcore2"],
"body": "est rerum tempore vitae\nsequi sint nihil reprehenderit dolor est rerum tempore vitae\nsequi sint nihil reprehenderit dolor est rerum tempore vitae\nsequi sint nihil ..."
}
Get post by ID
Method: GET
Route: http://localhost:6060/api/posts/:id
Here, we copy the id generated when the post was created and use it to replace the :id in the route. This becomes:
http://localhost:6060/api/posts/63bdc533c5bae085a4fc1394
Other Routes and their methods are shown below.
Get All Posts
Method: GET
Route: http://localhost:6060/api/posts
Get All Published Posts
Method: GET
Route: http://localhost:6060/api/publishedPosts
Get All Published Posts by Title, Author or Tag
Method: GET
Route: http://localhost:6060/api/publishedPosts?title=yourTitle
To search by Author or Tag, just replace title in the Route.
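Since these search routes are plain query strings, any client can assemble them with URLSearchParams. A sketch using the base URL and sample values from above:

```javascript
// Build the published-posts search URL from whichever filters are wanted.
function searchUrl(params) {
  const base = "http://localhost:6060/api/publishedPosts";
  const qs = new URLSearchParams(params).toString();
  return qs ? `${base}?${qs}` : base;
}

console.log(searchUrl({ title: "qui est esse" }));
// http://localhost:6060/api/publishedPosts?title=qui+est+esse
console.log(searchUrl({ tag: "Anker2", page: 2 }));
// http://localhost:6060/api/publishedPosts?tag=Anker2&page=2
```

URLSearchParams handles the percent-encoding (spaces become +), so values with spaces, like the sample title, stay valid in the URL.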
Update A Post
Method: PUT
Route: http://localhost:6060/api/posts/:id
Delete A Post
Method: DELETE
Route: http://localhost:6060/api/posts/:id
Data Validation
Data validation serves as a form of protection that filters out invalid data sent by the user, ensuring that the data received is in the proper format. To mitigate invalid or unwanted data, we can set up filters that analyze every piece of data the user sends during signup and when creating posts. This can be done manually or with a validation library; in this article, we shall be using the joi validation library.
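To see what the library buys us, here is a deliberately small hand-rolled filter covering just two of the rules our joi schema will enforce (title as a 3-255 character string, body required); joi handles far more, declaratively:

```javascript
// Hand-rolled check mirroring two of the joi rules used in this article:
// title must be a 3-255 character string, body must be a non-empty string.
function validatePost(post) {
  const errors = [];
  const title = typeof post.title === "string" ? post.title.trim() : null;
  if (title === null || title.length < 3 || title.length > 255) {
    errors.push("title must be a string of 3-255 characters");
  }
  if (typeof post.body !== "string" || post.body.length === 0) {
    errors.push("body is required");
  }
  return errors; // empty array means the post is valid
}

console.log(validatePost({ title: "qui est esse", body: "est rerum..." })); // []
console.log(validatePost({ title: "ab" })); // two error messages
```

Writing and maintaining such checks for every field is exactly the boilerplate a schema-based validator removes.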
In the root folder, create a folder called validators. In this folder, create a file - posts.validator.js - which validates posts made by the user.
Let's install the joi library.
npm i joi
posts.validator.js file
const Joi = require("joi");

const PostSchema = Joi.object({
  title: Joi.string().min(3).max(255).trim().required(),
  description: Joi.string().min(3).max(500).optional().trim(),
  tags: Joi.array().items(Joi.string()).optional(),
  state: Joi.string(),
  read_count: Joi.number(),
  reading_time: Joi.number(),
  body: Joi.string().required(),
  author: Joi.string(),
  createdAt: Joi.date().default(Date.now),
  lastUpdateAt: Joi.date().default(Date.now),
});

// Every field is optional on update, so partial edits pass validation
const postUpdateSchema = Joi.object({
  title: Joi.string().min(1).max(255).trim(),
  description: Joi.string().min(1).max(500).trim(),
  tags: Joi.array().items(Joi.string()),
  state: Joi.string(),
  read_count: Joi.number(),
  reading_time: Joi.number(),
  body: Joi.string(),
});

async function postValidationMiddleWare(req, res, next) {
  const postPayload = req.body;
  try {
    await PostSchema.validateAsync(postPayload);
    next();
  } catch (error) {
    next({
      message: error.details[0].message,
      status: 400,
    });
  }
}

async function updatePostValidationMiddleware(req, res, next) {
  const postPayLoad = req.body;
  try {
    await postUpdateSchema.validateAsync(postPayLoad);
    next();
  } catch (error) {
    next({
      message: error.details[0].message,
      status: 400,
    });
  }
}

module.exports = {
  postValidationMiddleWare,
  updatePostValidationMiddleware,
};
In the code above, the postValidationMiddleWare and updatePostValidationMiddleware functions are both exported. They will be imported in posts.route.js and used as middleware to validate createPost and updatePost respectively.
posts.route.js file
const express = require('express');
const postRouter = express.Router();
const PostController = require("../controllers/post.controller");
const { verifyToken } = require('../middleware/verifyBearerToken');
const {
postValidationMiddleWare,
updatePostValidationMiddleware
} = require('../validators/posts.validator')
// Create Post
postRouter.post("/", verifyToken, postValidationMiddleWare, PostController.createPost);
// Get Post by ID
postRouter.get("/:id", verifyToken, PostController.getPost);
// Get All Posts or Search by Title, author or tag
postRouter.get("/", verifyToken, PostController.getAllPosts);
// Update A Post
postRouter.put("/:id", verifyToken, updatePostValidationMiddleware, PostController.updatePost);
// Delete A Post
postRouter.delete("/:id", verifyToken, PostController.deletePost);
module.exports = postRouter
This handles invalid data.
Deployment
The API was deployed on Cyclic. Depending on individual preference or professional advice, other cloud-hosting platforms such as Render, Heroku, etc. can be used for deployment instead.
Conclusion
This article covers server creation with MongoDB integration. Users are given the privilege to create, read, update, and delete their posts.