
DynamoDB is a fully managed NoSQL database service from AWS. At its core, DynamoDB combines the document and key-value store models.

It needs almost zero setup, scales linearly, and delivers single-digit-millisecond responses at any data size. In this blog post, I will show how to get started using the AWS console.

Before getting into the AWS console, we need to cover a few high-level concepts, and I will use SQL analogies to clarify them.

OpenSearch is a drop-in replacement for Elasticsearch. It was open-sourced by AWS and recently reached a release candidate.

Getting Started

To start, let’s use the official Docker Compose file. If you are new to Docker, go to the folder where you downloaded the file and start the services using

docker-compose up

You can read more about Docker Compose here.


OpenSearch needs more virtual memory (`vm.max_map_count`) than Docker allocates by default. On Windows, you can raise it to a higher number with the following commands.

Once that is done, you can browse the dashboard at http://localhost:5601/ and log in…

Tokenization plays an essential role in NLP: it converts text into numbers that deep learning models can use for processing.

No deep learning model can work directly with text. You need to convert it into numbers, or some format the model can understand.

BERT is based on the Transformer architecture and is currently one of the best models in NLP. It uses the subword tokenization method for tokenizing text.

In this blog post, we will learn about the subword tokenization method and the words the BERT algorithm knows.
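To build intuition for how subword tokenization works, here is a minimal sketch of the greedy longest-match (WordPiece-style) scheme BERT uses. The tiny vocabulary below is my own invention for illustration; BERT's real vocabulary has roughly 30,000 entries, and "##" marks a piece that continues a word.

```python
# A tiny vocabulary to illustrate WordPiece-style subword tokenization.
# "##" marks a piece that continues a word rather than starting one.
VOCAB = {"play", "##ing", "##ed", "token", "##ize", "##r", "the"}

def wordpiece(word, vocab=VOCAB, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        match = None
        # Greedily shrink the window until a vocabulary entry matches.
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub
            if sub in vocab:
                match = sub
                break
            end -= 1
        if match is None:
            return [unk]  # no piece matches: treat the word as unknown
        pieces.append(match)
        start = end
    return pieces

print(wordpiece("playing"))    # ['play', '##ing']
print(wordpiece("tokenizer"))  # ['token', '##ize', '##r']
```

Unknown words are never dropped: anything the vocabulary cannot cover falls back to the `[UNK]` token, which is why subword schemes handle rare words much better than whole-word vocabularies.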

PreTrained Model

The original BERT model was trained by Google…

Hugging Face provides an NLP library, Transformers, built on deep learning models of the same name. We will use the library to do sentiment analysis with just a few lines of code.

In this blog post, we will use a pre-trained, off-the-shelf model.

You can install the library using pip, which is similar to NuGet, npm, and Cargo.

How it Works

Once you have installed the library, you need to create a pipeline; for example, a sentiment analysis pipeline.
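As a sketch (assuming the `transformers` package is installed), creating a sentiment analysis pipeline takes only a couple of lines. With no model named explicitly, the library downloads a default pre-trained checkpoint on first use:

```python
from transformers import pipeline

# Create a sentiment-analysis pipeline; the library picks and
# downloads a default pre-trained model the first time this runs.
classifier = pipeline("sentiment-analysis")

result = classifier("I love this library!")
# result is a list of dicts with a 'label' (POSITIVE/NEGATIVE for the
# default model) and a confidence 'score' between 0 and 1.
print(result)
```

The same `pipeline(...)` call accepts other task names, so switching tasks means changing one string rather than rewriting the code.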

Similarly, you can create pipelines for other tasks…

We looked at how to prepare the data for sentiment analysis in the previous blog post. As a continuation, we will look at how to build a sentiment analyzer with a basic dictionary.

Yes, you read that right. In fact, five years back this method was used in Gmail to filter emails as spam or not spam.

Let's start from first principles (fancy words). Suppose we have a sentence like

We can split the sentence into words, count how many are positive and how many are negative, and if the positive words outnumber…
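The idea above can be sketched in a few lines of Python. The word lists here are toy examples of my own; a real sentiment lexicon would contain thousands of entries:

```python
# Toy positive/negative word lists; a real lexicon is far larger.
POSITIVE = {"good", "great", "love", "excellent", "fun"}
NEGATIVE = {"bad", "terrible", "hate", "boring", "awful"}

def dictionary_sentiment(sentence: str) -> str:
    words = sentence.lower().split()
    pos = sum(w in POSITIVE for w in words)  # count positive hits
    neg = sum(w in NEGATIVE for w in words)  # count negative hits
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(dictionary_sentiment("this game is great fun"))       # positive
print(dictionary_sentiment("what a boring terrible plot"))  # negative
```

The obvious weakness is that counting ignores context ("not good" scores as positive), which is exactly why the later posts move to deep learning models.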

Sentiment analysis plays a significant role in marketing. In this project, I tackle automating the analysis of millions of reviews. I am using fictional video game review data from a Manning liveProject to show how it's done.

This project's primary goal is to understand NLP using deep learning, along with the complete lifecycle of developing an application.


Let us start with the first step in the project by downloading data from the web.

The downloaded data is moved into the Data folder. Video_Games_5.json.gz is a gzipped file, and we can open it using the gunzip command in…
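Instead of unpacking the archive on disk, you can also stream it directly from Python. This sketch assumes the file is in JSON Lines format (one JSON review per line), which is how these review dumps are typically distributed; the function name is my own:

```python
import gzip
import json

def read_reviews(path, limit=None):
    """Read a gzipped JSON Lines file (one JSON object per line)."""
    reviews = []
    # "rt" opens the gzip stream in text mode, decompressing on the fly.
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for i, line in enumerate(f):
            if limit is not None and i >= limit:
                break  # stop early when only a sample is needed
            reviews.append(json.loads(line))
    return reviews
```

The `limit` parameter is handy during exploration, since loading millions of reviews just to inspect a few fields is wasteful.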

Whether you are a beginner or a practitioner of deep learning, setting up a workstation is a daunting task. More than daunting, it can kill your interest before you even start your learning path.

One more significant issue is that a workstation is costly, and it's challenging to choose a GPU or a library stack.

Jarvis Labs has come up with a simple and affordable solution to democratize AI. They offer a GPU-enabled machine with all the libraries preconfigured.

What drew me to the platform is that a powerful machine is just a single click away. So start by clicking on

Training deep learning models is time-consuming, and you can easily spend a day on just that. The best practice in this area is to train on a GPU.

I personally prefer PyTorch for deep learning, and in this blog post I will share the helper functions I use for training.

Get Default device on the Machine

Sometimes I train on my personal laptop, which doesn't have a GPU; this function lets me seamlessly switch devices based on where the model is being trained.
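A minimal sketch of such a helper (the function names are my own; this is a common PyTorch pattern rather than the post's exact code):

```python
import torch

def get_default_device():
    """Return the GPU device if one is available, otherwise the CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

def to_device(data, device):
    """Move a tensor, or a list/tuple of tensors, to the given device."""
    if isinstance(data, (list, tuple)):
        return [to_device(x, device) for x in data]
    # non_blocking lets the copy overlap with compute when using
    # pinned host memory; it is a harmless no-op on the CPU.
    return data.to(device, non_blocking=True)
```

Because `to_device` recurses into lists and tuples, the same call works for a single tensor or for a whole `(inputs, labels)` batch from a DataLoader.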

This series is based on Raw Coding's videos and can be considered a written version of them.

Let's start with an empty ASP.NET Core application with a Home controller and a couple of actions (Index, Secret). As the name suggests, I want to protect the Secret endpoint, and a simple way to do that is the [Authorize] attribute.

If you run the application now, you can reach both the Home view and the Secret view. So how do we block the Secret view?


To make the [Authorize] attribute work, you need to add the relevant middleware, and that magic happens in the Startup class.


After adding the change, when you try to…

System.Threading.Tasks.Task has been the de facto best practice for async programming in .NET for years. C# 7.0 added a cherry on top with ValueTask, for optimizing performance and allocations.

Performance benefit

To illustrate the performance benefit, let's call an API that returns restaurant data for a particular city and cache the result. I use caching here to show a mix of synchronous and asynchronous code paths.

I ran a benchmark with ValueTask and Task; here are the results.

As per the result, we are…

