Understanding durability & write safety in MongoDB

Durability is the “D” in the “ACID” properties popularized by traditional RDBMS. Durability is the guarantee that written data has been saved and will survive permanently. NoSQL databases like MongoDB give developers fine-grained control over the durability of their write calls. This enables developers to choose different durability, safety and performance models for different classes of data. However, it also places the burden on the developer to discern and understand the nuances of the different write safety options. In this blog post we will look at the different options for write safety provided in the Java driver. In MongoDB parlance this is called “write concern”. Write concerns vary from “weak” to “strong”: weak write concerns can lead to higher throughput but provide less data safety, while strong write concerns trade throughput for greater data safety.
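As a rough sketch of that ladder, here are the common levels expressed as mongo-shell-style write concern documents (the Java driver exposes the same levels through its WriteConcern class; the `orders` collection and the exact timeout value below are hypothetical examples):

```javascript
// Write concern levels, from weakest to strongest.
// These option objects follow MongoDB's writeConcern document format.
var unacknowledged = { w: 0 };                          // fire-and-forget: fastest, least safe
var acknowledged   = { w: 1 };                          // primary acknowledged the write
var journaled      = { w: 1, j: true };                 // also flushed to the primary's journal
var replicated     = { w: "majority", wtimeout: 5000 }; // acknowledged by a majority of the replica set

// Usage in the mongo shell (requires a running server):
// db.orders.insert({ item: "abc" }, { writeConcern: replicated });
```

Moving down the list costs latency per write but shrinks the window in which a crash or failover can lose acknowledged data.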

Continue reading

How to find a needle in a haystack?

The poster child scenario for big data: you need to sift through a large amount of data to extract a tiny “nugget” of information, and you need to do it in as little time as possible because your business depends on it. Historically, with traditional RDBMS technology, this sort of scenario required a large team and a large investment of time and money. Most traditional RDBMSs only scale vertically, so you have to keep buying larger and larger machines to reduce your turnaround time. The advent of public clouds and NoSQL databases like MongoDB has completely disrupted how teams think about this scenario.

Continue reading

Implementing pagination with MongoDB, Express.js & Slush

MongoDB accepts and provides access to data in the JavaScript Object Notation (JSON) format. This makes MongoDB a perfect fit when dealing with JavaScript-based REST services. In this post, we will take a look at pagination using MongoDB. We will scaffold a simple Express/Mongojs application using slush-mongo, then use skip() and limit() to fetch the required records from a set of data.

Pagination is one of the simplest ways to improve the user experience when dealing with medium to large data sets. We split the data into pages of x records each, giving us (total records / x) pages, and render a pagination control listing the page numbers. As the user clicks a page number, we seek and fetch only the set of records for that particular view.
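The page arithmetic itself is tiny. A minimal sketch (the helper names are mine, not part of the scaffolded app):

```javascript
// Compute the skip/limit values for a given page number.
function pageParams(page, perPage) {
    return {
        skip: (page - 1) * perPage, // records to jump over
        limit: perPage              // records to return
    };
}

// Total number of pages for a collection of a given size.
function totalPages(totalRecords, perPage) {
    return Math.ceil(totalRecords / perPage);
}

// Example: page 3 at 10 records per page
// pageParams(3, 10)   -> { skip: 20, limit: 10 }
// totalPages(999, 10) -> 100
```

In mongojs this maps directly onto the cursor methods, e.g. `db.testData.find().skip(20).limit(10)`.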

Pagination

You can find a live demo of the app here and the complete code for this app here.

Setup the Project

Create a new folder named mongoDBPagination and open a terminal/prompt there. Next, we will install the gulp, slush and slush-mongo modules. Run


$ [sudo] npm i -g gulp slush slush-mongo

Once this is done, run


$ slush mongo

You will be asked a few questions; you can answer them as follows


[?] Which MongoDB project would you like to generate? Mongojs/Express
[?] What is the name of your app? mongoDBPagination
[?] Database Name: myDb
[?] Database Host: localhost
[?] Database User:
[?] Database Password:
[?] Database Port: 27017
[?] Will you be using heroku? (Y/n) n

This will scaffold a simple Express/Mongojs app for us. Once the installation is done, run


$ gulp

Then open http://localhost:3000 in your favorite browser and you should see a table with the list of routes configured in the application. This confirms that you have installed everything correctly.

Setup Test DB

We will create a new collection named 'testData' and populate some test data in it. Then we will show this data in a paginated table. Open a new terminal/prompt and run


$ mongo

Then run


use myDb

to select our DB. Next, copy the snippet below, paste it in the mongo shell and hit return.

for (var i = 1; i <= 999; i++) {
    db.testData.insert({
        name: Math.random().toString(36).substring(7),
        age: Math.floor(Math.random() * 99),
        random: Math.random().toString(36).substring(7)
    });
}

Continue reading

The three A’s of MongoDB security – Authentication, Authorization & Auditing


MongoDB, Inc. has made impressive strides over the past 18 months. One of the areas of the product that has seen the most significant improvement is security. Security is of paramount importance for a production database. Existing relational databases provide a number of knobs and controls to help the DB administrator manage the security of their database, and MongoDB is getting to a similar place as well. In this post we will delve deeper into the security features in the areas of authentication, authorization & auditing.
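As a quick taste of the authentication and authorization pieces, this is the shape of the user document you would pass to db.createUser in the 2.6 mongo shell (the user name, databases and password here are hypothetical examples, not values from any real deployment):

```javascript
// Role-based access control sketch: an application user scoped to specific databases.
// All values below are illustrative placeholders.
var appUser = {
    user: "reportingApp",
    pwd: "useAStrongPasswordHere",
    roles: [
        { role: "readWrite", db: "myDb" },     // full read/write on one database
        { role: "read",      db: "analytics" } // read-only elsewhere
    ]
};

// In the mongo shell, against the target database:
// db.createUser(appUser);
```

Scoping roles per database like this is the authorization half; requiring the credentials at connect time is the authentication half.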

Continue reading

MongoDB shards and unbalanced aggregation loads

The aggregation framework is a vital cog in the MongoDB infrastructure. It helps you analyze, summarize and aggregate the data stored in MongoDB. Refer to this blog post for more details about the aggregation framework in MongoDB 2.6.

In the 2.6 release MongoDB made a subtle but significant change in the way the underlying aggregation pipelines execute in a sharded environment. When working with sharded collections MongoDB splits the pipeline into two stages. The first stage or the “$match” phase runs on each shard and selects the relevant documents. If the query planner determines that a shard is not relevant based on the shard keys then this phase is not executed on that shard.
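Concretely, keeping $match at the front of the pipeline is what lets mongos prune shards, assuming the matched field is (or prefixes) the shard key. A sketch with illustrative field names:

```javascript
// Pipeline sketch: $match first so the planner can skip irrelevant shards.
// Field and collection names are illustrative, not from a real deployment.
var pipeline = [
    // Phase 1: runs on each shard the planner deems relevant
    { $match: { created_on: { $gte: new Date(Date.UTC(2012, 0, 1)) } } },
    // Phase 2: per-shard results are merged and grouped
    { $group: { _id: { $minute: "$created_on" }, count: { $sum: 1 } } }
];

// In the mongo shell: db.randomData.aggregate(pipeline);
```

If $match is buried later in the pipeline, every shard has to participate and stream documents to the merging node.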

Continue reading

Yeoman, Mongoose and MongoDB

In our previous post we talked about getting started with Mongoose and MongoDB. In this post, we will see how to use Yeoman and scaffold a new Mongoose/Express project.

Yeoman is a scaffolding tool that scaffolds out projects using Grunt, Bower and Node. There are times when you end up copy-pasting boilerplate code around to create a new project. This is precisely what Yeoman does for you, but with a single command and a few awesome generators.

Yeoman uses Grunt as the task runner to perform run/build/test tasks. If you want to use Gulp instead, you can check out Slush. Slush is also a scaffolding tool, but uses Gulp as the task runner.

Getting Started with Yeoman

To make our lives easy, we will be using a Super Awesome Yeoman Generator named generator-mongoose, which will help us in setting up a new project as well as help us in scaffolding schemas.

This generator uses Express.js as the server, HTML for templating and a tinge of Bootstrap CSS to make things look good.

Let’s create a new folder and name it yoMongoose. cd into the folder and run the following:
To install Yeoman

[sudo] npm install -g yo

To install generator-mongoose

[sudo] npm install -g generator-mongoose

and finally run

yo mongoose

to scaffold a new project. Fill in the questions as follows

[?] Database Name: (myDb) myTestDB
[?] Database Host: (localhost) localhost
[?] Database User: {hit return}
[?] Database Password: {hit return}
[?] Database Port: (27017) 27017
[?] Will you be using heroku? (Y/n)  n

Yeoman will go off and scaffold a new project for you. Your folder structure should contain a /node_modules folder and a public/bower_components folder. If you do not see either of them, please run npm install and bower install respectively.

To run the app, execute

grunt

This will start off the express server and launch the home page in your default browser. The default page you see is a list of routes configured in the application.

Back in the folder, let’s have a quick walkthrough of the app.

config/db.js – consists of the DB configs and some options you can mess around with

models/post.js – an example schema of a blog post. All other models, which we will scaffold with the sub-generator, will appear here.

public/ – consists of the JavaScript and CSS needed for the UI

routes/
index.js – consists of the default route, which dispatches index.html
post.js – consists of the 5 key endpoints you need to interact with the posts collection

test/ – consists of the tests for the Post route and its methods

views/ – consists of all the templates & views sent to the client.

I recommend taking a peek at the following in order

config/db.js
models/post.js
routes/post.js
app.js

to get a feel for where things go in a modular Express app. Once you are done, we will scaffold another model named articles using the sub-generator.
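For orientation, the 5 endpoints in routes/post.js typically follow the usual REST mapping. The table below is my sketch of that convention; check the generated file for the exact paths:

```javascript
// Hypothetical REST route map for the posts collection
// (the conventional shape, not copied from the generated routes/post.js).
var postRoutes = [
    { method: "GET",    path: "/posts",     action: "list all posts" },
    { method: "GET",    path: "/posts/:id", action: "fetch one post" },
    { method: "POST",   path: "/posts",     action: "create a post" },
    { method: "PUT",    path: "/posts/:id", action: "update a post" },
    { method: "DELETE", path: "/posts/:id", action: "delete a post" }
];
```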

Back to terminal/prompt and run

yo mongoose:schema "article|title:String,excerpt:String,content:String,published:Boolean,created:Date"

The above command will result in:

Your creating a schema for article
With the fields: title,excerpt,content,published,created
starting request to schematic for test mock data...
create routes/article.js
create models/article.js
create test/test-article.js

Continue reading

Getting started with MongoDB and Mongoose

What is Mongoose?

Mongoose is “elegant mongodb object modeling for node.js”. If you have used MongoDB before and tried basic database operations, you might have noticed that MongoDB is “schema-less”. When you are looking to implement a more structured database and want to leverage the power of MongoDB, Mongoose is one of the ODM (Object Data Mapping) solutions.

To demonstrate quickly, you can run an insert command on a collection named users like


db.users.insert({ name : 'Arvind', gender : 'male'});

And right after that you can run


db.users.insert({ name : 'Arvind', gender : 'male', password : '!@#$'});

and MongoDB will never complain about the variation in the number of columns (key-value pairs). This is very flexible. But when you want to keep your data more organized and structured, you would need to maintain that structure in your server code, writing validation and making sure nothing irrelevant is stored in a collection. This is where Mongoose makes life easy.

“Mongoose provides a straight-forward, schema-based solution to modeling your application data and includes built-in type casting, validation, query building, business logic hooks and more, out of the box.”

Install Node.js & MongoDB

To use Mongoose, we need to have Node.js installed; you can find info here.

Start Developing

Let us first create a small playground where we can have fun. Create a new folder named myMongooseApp, open a terminal/prompt there and run

npm init

This will help us initialize a new Node project; fill it up as required. Next, we will install Mongoose as a dependency of our project. Run

npm install mongoose --save

then start the MongoDB service by running

mongod

Next, create a new file named index.js at the root of the folder and open it in your favorite editor. Add the code below.

var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/myTestDB');

var db = mongoose.connection;

db.on('error', function (err) {
    console.log('connection error', err);
});

db.once('open', function () {
    console.log('connected.');
});

Here, we require the mongoose package and initialize a connection to the DB. The name of our database is myTestDB.

Then run

node index.js

and you should see the connected message. You can also use a node package named nodemon to automatically restart the node server on changes.
Now our sandbox is ready to play in!

Mongoose Schemas

Schemas are like skeletons: the bare bones of how your data collection will look. If you are dealing with a collection of users, your schema would look something like this.

Name - String
Age - Number
Gender - String
Date of Birth - Date

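Translated into Mongoose, that skeleton becomes a plain object of field-to-type mappings, which is exactly what new mongoose.Schema(...) expects (the model name and the dateOfBirth field name are my choices for this sketch):

```javascript
// The users skeleton above, expressed as the object you would
// hand to new mongoose.Schema(...). Field names are illustrative.
var userShape = {
    name: String,
    age: Number,
    gender: String,
    dateOfBirth: Date
};

// With mongoose loaded and a MongoDB server running:
// var User = mongoose.model('User', new mongoose.Schema(userShape));
// new User({ name: 'Arvind', gender: 'male' }).save(function (err) { /* ... */ });
```

Documents that don't fit the schema's types will now fail validation instead of silently landing in the collection.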
Continue reading

MongoDB on AWS: How to choose the right EC2 instance type for your MongoDB server?

Let’s face it: AWS has gotten incredibly complicated. A simple task like picking the right instance type for your MongoDB server requires a fair bit of research. How do you know which server type to choose in the alphabet soup of options? In this blog post we will break down the different instance types and how they apply to your MongoDB scenarios. To keep things simple we are not going to talk about disk types or sizes in this post; that’s the topic of our next post.

Continue reading

MongoDB 2.6 Aggregation framework improvements

This is a guest post by Vlad Mihalcea. Vlad is a software architect passionate about software integration, high scalability and concurrency challenges. Here is a link to the original post.

MongoDB is evolving rapidly. The 2.2 version introduced the aggregation framework as an alternative to the Map-Reduce query model. Generating aggregated reports is a recurrent requirement for enterprise systems, and MongoDB shines in this regard. If you’re new to it, you might want to check out this aggregation framework introduction or the performance tuning and data modelling guides.

Let’s reuse the data model I first introduced while demonstrating the blazing fast MongoDB insert capabilities:

{
        "_id" : ObjectId("5298a5a03b3f4220588fe57c"),
        "created_on" : ISODate("2012-04-22T01:09:53Z"),
        "value" : 0.1647851116706831
}

MongoDB 2.6 Aggregation enhancements

In the 2.4 version, if I run the following aggregation query:

db.randomData.aggregate( [
{
    $match: {
        "created_on" : {
            $gte : new Date(Date.UTC(2012, 0, 1)),
            $lte : new Date(Date.UTC(2012, 0, 10))
        }
    }
},
{
    $group: {
        _id : {
            "minute" : {
                $minute : "$created_on"
            }
        },
        "values": {
            $addToSet: "$value"
        }
    }
}]);

Continue reading

Configuring MongoDirector permissions on AWS using an IAM policy template

MongoDirector supports the ability to manage your MongoDB clusters in your own AWS account. This model has several advantages, as outlined in this blog post. In order to manage MongoDB clusters in your own AWS account, MongoDirector requires certain permissions. Our recommendation is to restrict these permissions so that MongoDirector has enough access to manage your MongoDB servers and nothing more. This can be done by configuring a custom Identity and Access Management (IAM) policy for the AWS keys that you input into MongoDirector. MongoDirector provides two types of IAM policies.
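For reference, an IAM policy document has the general shape below. The Action list here is an illustrative EC2 subset chosen by me, not MongoDirector's actual policy template; consult their template for the real list:

```javascript
// Sketch of a restrictive IAM policy document (JSON, expressed as a JS object).
// The Version string is the standard IAM policy-language version;
// the Action entries are illustrative, not the official template.
var policy = {
    Version: "2012-10-17",
    Statement: [{
        Effect: "Allow",
        Action: [
            "ec2:RunInstances",      // create servers
            "ec2:TerminateInstances", // tear them down
            "ec2:DescribeInstances"  // read-only inventory
        ],
        Resource: "*"
    }]
};

// JSON.stringify(policy, null, 2) produces the text you would paste
// into the AWS IAM console when creating the policy.
```

Everything not explicitly allowed is denied by default, which is what keeps the keys you hand over narrowly scoped.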

Continue reading