Displaying Listings Similar to Search Results

Searching for similar items or places based on a set of parameters is commonplace in today's online world. Displaying listings similar to what users are looking for is therefore a must for businesses, and it poses a challenge for developers. Here, I explain a scenario using a real estate web application as an example.

Example

A real estate web application has multiple property listings. A user searches for one property and would like to explore similar properties in that particular area. In this scenario, we need to display all matching properties based on geo location and zip code.

Prerequisites

Save the geo location (latitude and longitude) and zip code of the property while it is being listed.

Property Listing and Validation

Step 1:

Before updating any property, we should get the following inputs from the user:

1) Street Line 1

2) Street Line 2

3) City

4) State

5) Country

6) Zip Code

Step 2:

Get Geo Location (i.e. Latitude and Longitude)

a) By Address:

//calling Google maps API for fetching Geo-Location Based on address

"https://maps.googleapis.com/maps/api/geocode/json?address=" + <FullAddress> + "&sensor=true&key=" + <GoogleApiKey>


b) By Zip Code and Country Code:

//calling Google maps API for fetching Geo Location Based on Country Code and Zip code

"https://maps.googleapis.com/maps/api/geocode/json?components=country:" + <Country> + "|postal_code:" +  <pincode> + "&sensor=true&key=" + <GoogleApiKey>;
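Both request URLs can be produced by a small helper. Below is a minimal sketch in Node.js; the function names are our own, and the `sensor` parameter is omitted because Google has since deprecated it:

```javascript
// Sketch of URL builders for the two geocoding calls above.
// Helper names and the apiKey placeholder are our own.
function geocodeUrlByAddress(fullAddress, apiKey) {
  return 'https://maps.googleapis.com/maps/api/geocode/json' +
    '?address=' + encodeURIComponent(fullAddress) +
    '&key=' + apiKey;
}

function geocodeUrlByZip(countryCode, zipCode, apiKey) {
  return 'https://maps.googleapis.com/maps/api/geocode/json' +
    '?components=' + encodeURIComponent('country:' + countryCode + '|postal_code:' + zipCode) +
    '&key=' + apiKey;
}
```

Encoding the address and components with `encodeURIComponent` avoids malformed requests for addresses containing spaces or special characters.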

Step 3:

Validating Zip Code:

The two geo locations need to be compared using approximately rounded values. If they match, we allow the user to move to the next step; otherwise, we clear the zip code field and show an alert such as 'Wrong zip code entered'.
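The comparison in Step 3 can be sketched as follows; the helper name and the 2-decimal rounding precision (roughly a 1 km grid) are our own assumptions to tune as needed:

```javascript
// Sketch of the Step 3 check: compare the geo location resolved from the
// full address with the one resolved from country + zip code.
// Rounding to 2 decimals (~1 km) is an assumed precision; tune as needed.
function sameLocation(locA, locB, decimals) {
  var d = decimals === undefined ? 2 : decimals;
  var round = function (n) { return Number(n.toFixed(d)); };
  return round(locA.lat) === round(locB.lat) &&
         round(locA.lng) === round(locB.lng);
}
```

If `sameLocation` returns false, clear the zip code field and show the 'wrong zip code entered' alert.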

Step 4:

Save the Latitude and Longitude into the Address Table.

 

Implementation

To display similar listings when a user clicks on the similar property option, the below process needs to be followed:

Step 1:

We will get the below input parameters from the clicked property listing:

1) Latitude

2) Longitude

3) Price Range (Min and Max)

4) Zip Code

Step 2: (Optional based on application requirement)

In addition to Step 1, we should add two more input parameters:

1) Distance unit (constant: 111.045 km per degree, i.e., 69 statute miles per degree, 60 nautical miles per degree, or 552 furlongs per degree) ~ 12.92297284

2) Radius (km) ~ 5.0

Step 3:

Using the input parameters gathered in Steps 1 and 2, call the custom API, get the dataset, and bind it back to the UI.

The custom API is responsible for the below operations:

1) Taking Input parameters

2) Querying Database and fetching data from property table based on the input parameters

3) Sending the result dataset back to the client

Result

The end user will be able to see the various similar property listings that fall under the below criteria relative to the clicked property:

1) Within Price Range (Min & Max)

2) Within 5 Km Radius of the Geo Location (Latitude and Longitude)

3) With Same Zip Code
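The radius criterion above can be sketched with a distance check built around the 111.045 km-per-degree constant from Step 2. This is a hypothetical illustration; all names are ours:

```javascript
// Sketch of the similarity filter the custom API could apply (names are ours).
// 111.045 km per degree is the constant mentioned in Step 2 of Implementation.
var KM_PER_DEGREE = 111.045;

function distanceKm(a, b) {
  var toRad = function (d) { return d * Math.PI / 180; };
  var cosAngle = Math.sin(toRad(a.lat)) * Math.sin(toRad(b.lat)) +
                 Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) *
                 Math.cos(toRad(a.lng - b.lng));
  // Clamp to [-1, 1] to avoid NaN from floating-point drift, then convert
  // the central angle (in degrees) to km via the km-per-degree constant.
  return KM_PER_DEGREE * (180 / Math.PI) *
         Math.acos(Math.min(1, Math.max(-1, cosAngle)));
}

function isSimilar(clicked, candidate, radiusKm, minPrice, maxPrice) {
  return candidate.price >= minPrice && candidate.price <= maxPrice &&
         candidate.zip === clicked.zip &&
         distanceKm(clicked, candidate) <= radiusKm;
}
```

In a real API this filter would typically be pushed into the database query (a bounding box plus a great-circle formula) rather than applied in application code.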

Conclusion

The above process can be used in various applications to implement similar search. The search criteria can be changed based on the requirement.

Sending Exception Detail as Email through AWS SES

Amazon Simple Email Service (Amazon SES) is a highly scalable and cost-effective email service for developers and businesses. Amazon SES eliminates the complexity and expenditure of building an in-house email solution or licensing, installing, and operating a third-party email service for this type of email communication. In addition, the service integrates with other AWS services, making it easy to send emails from applications being hosted on AWS.

Prerequisites

  1. AWS SES
  2. AWS SNS
  3. AWS S3
  4. AWS Lambda

Requirement

When a user uploads an excel file through the web console, each row of data from the excel file should be saved into the database. If this process fails due to an exception, the exception error message must go to the configured email IDs.

Workflow

We use two lambda functions: one processes the excel sheet uploaded to the S3 bucket, and the other sends the exception email to the administrator.

While the uploaded excel sheet is being processed, if any failure or exception occurs, the excel-upload lambda function sends the error result to SNS (Simple Notification Service); SNS then triggers the exception-email lambda function, which sends the exception email to the administrator using SES.
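The hand-off to SNS described above might be sketched like this; the helper name, subject line, and message shape are our own assumptions, and the actual publish call is shown commented out since it needs AWS credentials:

```javascript
// Hypothetical sketch: the upload Lambda's error path publishing to SNS.
// The helper name, subject line, and message shape are our own.
function buildErrorNotification(topicArn, fileName, err) {
  return {
    TopicArn: topicArn,
    Subject: 'Excel import failed: ' + fileName,
    Message: JSON.stringify({ file: fileName, error: String(err) })
  };
}

// Inside the Lambda handler (needs aws-sdk and a real topic ARN):
// var sns = new (require('aws-sdk')).SNS();
// sns.publish(buildErrorNotification('<TopicArn>', key, err), callback);
```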

Solution

Setting up Prerequisites

1) AWS SNS

Create an SNS topic by following the below steps:

  1. Open your AWS account → choose SNS from Services
  2. Click on Create Topic → give the Topic Name and Display Name
  3. Click on Create Topic

2) AWS S3 Bucket

Create an AWS S3 Bucket with add permission and enable CORS configuration.

3) AWS Lambda

Two Lambda functions are required:

i) For reading excel file and saving each row data into concerned database

a)   Create the lambda function using below code:

// Reads the uploaded excel file from S3 and converts its rows to JSON.
// Assumes the usual handler boilerplate around it: params, key, bucketName,
// result, maxCount, and ExcelData are defined elsewhere in the Lambda.
var aws = require('aws-sdk');
var fs = require('fs');
// The post does not name the excel-to-JSON module; xlsx-to-json has this call shape.
var exceltojson = require('xlsx-to-json');
var s3 = new aws.S3();

s3.getObject(params, function (err, data) {
    if (err) {
        result.message = 'error while getting ' + key + ' from ' + bucketName + ' bucket';
        result.description = err;
        context.fail(result);
    } else {
        var wstream = fs.createWriteStream('/tmp/user.xlsx');
        wstream.write(data.Body, function (err) {
            if (err) {
                console.log(err);
            } else {
                exceltojson({
                    input: '/tmp/user.xlsx',
                    output: null
                }, function (err, rows) {
                    if (err) {
                        result.message = 'error while reading the ' + key + ' file from ' + bucketName;
                    } else {
                        maxCount = rows.length;
                        console.log('max count of rows in excel/csv file = ' + maxCount);
                        ExcelData = rows;
                    }
                });
            }
        });
    }
});

b)   Upload the zip file containing the Node.js code, which holds the logic for reading the excel file and pushing each row of data into the concerned database by calling a custom API.

c)   Map an AWS Lambda trigger to AWS S3, fired when the putObject() method is invoked on the bucket.

 

ii) To send the exception email

a)   Create lambda function using below code:

var ses = new aws.SES({
    apiVersion: '2010-12-01'
});

// Generating params to send email
var params = {
    Destination: {},
    Message: {
        Subject: {
            Data: Subject,
            Charset: 'UTF-8'
        },
        Body: {
            Html: {
                Data: message,
                Charset: 'UTF-8'
            }
        }
    }
};

params.Destination.ToAddresses = [emailTOAddress];
params.Source = FromAddress;

// Calling the send email function
ses.sendEmail(params, function (err, data) {
    if (err) { // failure
        result.message = err.message;
        console.log(err, err.stack);
        context.fail(result);
    } else { // success
        result.message = 'Email sent successfully';
        result.data = data;
        console.log(result);
        context.succeed(result);
    }
});

b)  Map an AWS Lambda trigger to the AWS SNS topic so it fires when any exception occurs in the first lambda function, i.e., while saving rows of the excel file into the database.

Implementation

Step 1: Upload the excel file into the AWS S3 bucket, either manually or through the AWS SDK.

Step 2: If everything goes well, all the records from the excel file will be saved into the database. Otherwise, an email with the exception details will go to the configured (admin) email ID.

Conclusion

Based on the configuration and the given excel file, records will be saved into the database; if any exception occurs, the details will be emailed to the admin (configured email).

 

Solutions Infini SMS gateway integration using Node.js, AWS lambda & API Gateway

AWS lambda

AWS Lambda runs your code on a high-availability compute infrastructure and performs all of the compute resource management including the operating system and server maintenance, automatic scaling and code monitoring, capacity provisioning, etc., which makes it ideal for sending messages.

Node.js Support

The AWS SDK for Node.js enables developers to build libraries and applications that use AWS services. You can use the JavaScript API in the browser and inside Node.js applications on the server.

API Gateway

Amazon API Gateway manages all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including authorization and access control, traffic management,  monitoring, and API version management.

Requirement

Sending an SMS based on the request parameters through API call.

Prerequisites

  • AWS lambda function
  • API Gateway
  • Node.js

Solution


The following steps are required for implementing the above flow chart:

Step 1: Create a Lambda function using the below code.

var globalTunnel = require('global-tunnel');
var aws = require('aws-sdk');
var request = require('request');

var solutionInfiApiUrl = 'http://alerts.solutionsinfini.com/api/v3/index.php?method=sms.json&api_key=' + <APIKey>;

var result = {};

// Initializing the JSON request
var Json = {};
Json.sender = <Sender Number>;
Json.message = <Message>;
Json.format = 'json';
Json.flash = 0;
Json.sms = [];

var receiver = {};
receiver.to = <receiver Number>;
Json.sms.push(receiver);

var body = JSON.stringify(Json);

// Making a POST request to send the SMS
request.post({
    url: solutionInfiApiUrl,
    body: body
}, function (error, response) {
    if (error) {
        result.message = 'Send SMS failed';
        result.error = error;
        context.fail(result);
    } else {
        result.message = 'SMS sent successfully to ' + receiver.to;
        console.log('response ' + JSON.stringify(response));
        context.succeed(result);
    }
});

Step 2: Create an API Gateway and map it to the above lambda function.

Step 3: Test the API through Postman (Chrome app) by passing the below request parameters.

Request format type: JSON
Method: POST

{
    "PhoneNumber": "<ReceiverMobileNumber>",
    "ProjectName": "<ProjectName>",
    "PropertyName": "<PropertyName>"
}
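For illustration, the mapping from this request body to the Solutions Infini payload built in the Step 1 Lambda could be factored into a helper. The message wording below is our own assumption, not the post's actual template:

```javascript
// Hypothetical mapping from the API request above to the Solutions Infini
// payload built in the Step 1 Lambda. The message wording is an assumption.
function buildSmsPayload(request, senderId) {
  return {
    sender: senderId,
    message: 'New enquiry for ' + request.PropertyName +
             ' in project ' + request.ProjectName,
    format: 'json',
    flash: 0,
    sms: [{ to: request.PhoneNumber }]
  };
}
```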

Conclusion

Using Solutions Infini, we can create multiple dynamic templates as well as static templates. Hence, we can send both dynamic and static SMS through AWS for the given phone number.

 

Elastic Search Query to Retrieve Records from Elastic Server

Elastic Search is an open-source search tool that is built on Lucene and is natively JSON + RESTful. Elastic Search provides a JSON-style domain-specific language that can be used to execute queries, referred to as the Query DSL.

The search API allows us to execute a search query and get back the search hits that match the query. Elastic search fetches records at lightning speed thanks to its schema-flexible document structure. The query can be provided either as a simple query string parameter or in a request body.

Here I am showing how to write queries for Elastic search with some good set of standard queries as an example.

 

Basic Queries Using Only the Query String

Basic queries can be done using only query string parameters in the URL. For example, the following searches for the text 'test' in any field of any document and returns at most 5 results:
{ElasticURL}/_search?size=5&q=test

 

Full Query API

Full queries are powerful, complex ones, including queries that involve faceting and statistical operations, and they use the full elastic search query language and API. Queries are written as JSON structures in the query language and sent to the search endpoint (query language details are given below). There are two options for sending a query to the search endpoint:

1. Either as the value of a source query parameter e.g. :

{ElasticURL}/_search?source={Query as JSON}

2. Or in the request body, e.g.,

{
     "query" : {
         "term" : { "PropertyName": "test" }
     }
 }

From & Size in Query

Pagination of results can be done by using the 'from' and 'size' parameters. The 'from' parameter defines the offset from the first result we want to fetch, and the 'size' parameter sets the maximum number of hits to be returned.

{
     "size" : 10,
     "from" : 0,
     "query" : {}
 }
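As a small worked example of the arithmetic, a helper can derive `from` for a 1-based page number and merge both parameters into a query body (the helper name is ours):

```javascript
// Derive 'from' for a 1-based page number and merge pagination into a
// query body (helper name is ours).
function paginate(queryBody, page, pageSize) {
  return Object.assign({}, queryBody, {
    from: (page - 1) * pageSize,
    size: pageSize
  });
}
```

For example, page 3 with a page size of 10 yields `from: 20, size: 10`.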

Sample response from Elastic server

{
  "took": 7,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 4,
    "max_score": 4.5618434,
    "hits": [
      {
        "_index": "ph_property",
        "_type": "property",
        "_id": "10322",
        "_score": 4.5618434,
        "_source": {
          "PropertyID": 10322,
          "PropertyCode": "VTELD21NGXKK3V02GJRLPRROB",
          "BuilderCode": "BY67DP",
          "BuilderName": "Janet Spencer",
          "PropertyName": "AWS test",
          "BHK": "",
          "PropertyTypeCode": "DO20ET",
          "PropertyType": "Residential Land"
        }
      }
    ]
  }
}

Query DSL Examples

1. Match all/Find Everything

{
     "query" : {
         "match_all" : { }
     }
 }

2. Filter on one field

{
     "query" : {
     "term" : { "<field-name>": "<value>" }
     }
 }

3. Match with a field

{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "field": "value"
          }
        }
      ]
    }
  }
}

4. Multi-match query builds on the match query to allow multi-field queries

{
  "multi_match" : {
    "query":    "this is a test",
    "fields": [ "subject", "message" ]
  }
}

5. Find documents that contain the exact term specified in the given field

{
  "query": {
    "bool": {
      "should": [
        {
          "term": {
            "status": {
              "value": "urgent"
            }
          }
        },
        {
          "term": {
            "status": "normal"
          }
        }
      ]
    }
  }
}

6. Find documents where the specified field contains values (strings, numbers, or dates) in the given range

{
  "size": "9",
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "BudgetFrom": {
              "gte": 50000
            }
          }
        },
        {
          "range": {
            "BudgetTo": {
              "lte": 2231346
            }
          }
        }
      ]
    }
  }
}

7. The filtered query is used to combine a query that is used for scoring with another query that is used for filtering the result set.

{
  "filtered": {
    "query": {
      "match": { "tweet": "full text search" }
    },
    "filter": {
      "range": { "created": { "gte": "now-1d/d" }}
    }
  }
}

8. Filters documents that only have the provided ids.

{
    "ids" : {
        "type" : "my_type",
        "values" : ["1", "4", "100"]
    }
}

9. Filters documents that are matching the provided document/mapping type.

{
    "type" : {
        "value" : "my_type"
    }
}

10. Filter on two fields

{
    "query": {
        "filtered": {
            "query": {
                "match_all": {}
            },
            "filter": {
                "and": [
                    {
                        "range" : {
                            "b" : { 
                                "from" : 4, 
                                "to" : "8"
                            }
                        },
                    },
                    {
                        "term": {
                            "a": "john"
                        }
                    }
                ]
            }
        }
    }
}

An actual example with some search parameter:

{
  "from": "0",
  "size": "9",
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "ProjectTypeCode": "EQ92JK"
          }
        },
        {
          "match": {
            "PropertyStatusCode": "AS82IZ"
          }
        },
        {
          "match": {
            "PropertyTypeCode": "SJ85GF"
          }
        },
        {
          "match": {
            "MicroMarketCode": "DX60DL"
          }
        },
        {
          "range": {
            "SizeFrom": {
              "gte": 1000
            }
          }
        },
        {
          "range": {
            "BudgetFrom": {
              "gte": 50000
            }
          }
        },
        {
          "range": {
            "BudgetTo": {
              "lte": 2231346
            }
          }
        },
        {
          "range": {
            "PossessionDate": {
              "gte": "2016-01-01"
            }
          }
        },
        {
          "match": {
            "City": "Bengaluru"
          }
        }
      ]
    }
  }
}
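A query body like the one above can be assembled programmatically. Below is a sketch, assuming our own conventions for the `matches`/`ranges` arguments:

```javascript
// Sketch of a builder for the bool/must query shown above; the shape of
// the matches and ranges arguments is our own convention.
function buildPropertySearch(matches, ranges, size, from) {
  var must = [];
  Object.keys(matches).forEach(function (field) {
    var m = {};
    m[field] = matches[field];
    must.push({ match: m });
  });
  Object.keys(ranges).forEach(function (field) {
    var r = {};
    r[field] = ranges[field];
    must.push({ range: r });
  });
  return { from: String(from), size: String(size), query: { bool: { must: must } } };
}
```

For example, `buildPropertySearch({ City: 'Bengaluru' }, { BudgetFrom: { gte: 50000 } }, 9, 0)` reproduces the City and BudgetFrom clauses of the query above.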

Conclusion

These are a few frequently used queries to retrieve data from an Elastic server. However, results can also be retrieved using various other queries, such as geo queries, joining queries, compound queries, and specialized queries. We can even combine multiple queries to get hits from the elastic server. The response from the Elastic server is very fast compared to MySQL/SQL Server queries, and hence it is now widely used.

Source Code Generation using Razor Template Engine for Both Client Side and Server Side

Automating source code generation can be surprisingly easy and reaps major benefits. It can help you develop 90% of the APIs for any project in just a button click. The primary benefits of automating source code generation are shown below in the form of a bar chart.

Razor Template Engine

Explanation

Creating dynamic source code (Controller, facade, and Dao including interfaces) using code templates and razor template engine.

Steps to be followed to generate controller code are as follows

1. Create a controller template and keep only the common operations. Rename the file with extension as .txt.

2. Inside the file, wherever the entity name appears, replace it with razor template code (@Model.Entity). Assume Entity is your table name, which will be given as an input parameter in reality.

3. Create a model object of dynamic type (refer email sending template in our solution)

4. Then bind data in the template like

var result = Utilities.BindDataInTemplate(template, "reg-email", input.User as Object); Here, the BindDataInTemplate function, written in the utilities file, binds the data into the template.

5. Save the result string as a .cs file, e.g., Controller.cs.

Similar steps are followed to generate the other source code (facade and DAO, including interfaces).
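The post's solution uses the Razor engine in C#; purely as an illustration of the binding idea in Step 4, the same placeholder substitution can be sketched with a regular expression (a simplification of what Razor actually does):

```javascript
// Illustration only: the post's solution uses the Razor engine in C#.
// This sketches the same binding idea with plain string substitution:
// every @Model.<Key> placeholder is replaced with the model's value.
function bindTemplate(template, model) {
  return template.replace(/@Model\.(\w+)/g, function (whole, key) {
    return model[key] !== undefined ? model[key] : whole;
  });
}
```

Note that real Razor also handles placeholders adjacent to identifiers (e.g. `@(Model.Entity)Controller`); this sketch only replaces whole-word placeholders.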

Details of API to generate server side dynamic source code 

API URL : http://localhost:53154/api/controller/create

Method Type : POST

Request :

{
    "Entity": "Table name",
    "Author": "User name"
}

 

Details of API to generate client side dynamic source code 

(simple html,route.js,module.js,controller.js,service.js)

API URL : http://localhost:53154/api/controller/create/js

Method Type : POST

Request :

{
    "Entity": "Table name",
    "Author": "User name",
    "ControllerDescription": "Description for Controller.js",
    "ServiceDescription": "Description for Service.js",
    "FilePath": "File path (Example: E:\\Angulartemplate)"
}

  

Reference : Please refer trick#2 in the below link for using code templates and razor template engine

http://odetocode.com/blogs/scott/archive/2013/01/09/ten-tricks-for-razor-views.aspx.

Conclusion

Reliability, availability, productivity, performance, and cost reduction are powerful arguments for adopting an automation solution. It helps you create over 90% of the APIs required for an application. However, the tool generates code for APIs with common operations, i.e., Save/Update, GetById, GetAll, Search, Search with Pagination, and Delete. APIs specific to the application must still be written by hand.

 

Automating Deployment of AWS Lambda

AWS Lambda is a compute service that lets you run code without provisioning or managing servers. AWS Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second.

Deploying code to AWS Lambda through the AWS Web Console is insecure and time-consuming, since every time the user needs to log into the AWS Lambda console and then upload the zip file or provide an AWS S3 file path manually. To overcome this, there is a much simpler way to automate the deployment process, as shown in the below flow chart.


Setup and Configuration:

Step 1: Create an AWS Lambda function and deploy the below zip file into it,

https://github.com/avinashl3175/Vm_BlogContent/blob/master/Deploying_Lambda.zip

Step 2: Enable a Lambda trigger on the AWS S3 bucket for when the putObject method is invoked. Both the S3 bucket and the Lambda function should be in the same region.

Step 3: Enable Versioning inside AWS S3 Bucket.

Step 4: Create a config.json file using any of the below configurations,

a) For Deployment into a new Lambda Function:

{
    "accessKeyId": "<AccessKeyId>",
    "secretAccessKey": "<SecretAccessKey>",
    "region": "<Region>",
    "lambdaFunctionName": "<LambdaFunctionName>",
    "lambdaFunctionType": "new",
    "lambdaHandler": "index.handler",
    "lambdaRole": "<ARN Name>",
    "lambdaRuntime": "nodejs4.3",
    "lambdaDescription": "<Description>"
}

b) For Deployment into an old Lambda Function:

{
    "accessKeyId": "<AccessKeyId>",
    "secretAccessKey": "<SecretAccessKey>",
    "region": "<Region>",
    "lambdaFunctionName": "<LambdaFunctionName>",
    "lambdaFunctionType": "old"
}
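Since a malformed config.json only surfaces later as a failed deployment in the CloudWatch logs, it may help to validate it before pushing to S3. Below is a sketch, assuming the key sets shown in the two configurations above:

```javascript
// Sketch: validate a config.json before pushing it to S3. The required
// keys mirror the two configurations shown above ("new" vs "old").
// Returns the list of missing keys (empty means the config looks complete).
function validateLambdaConfig(config) {
  var base = ['accessKeyId', 'secretAccessKey', 'region',
              'lambdaFunctionName', 'lambdaFunctionType'];
  var forNew = ['lambdaHandler', 'lambdaRole', 'lambdaRuntime'];
  var required = config.lambdaFunctionType === 'new' ? base.concat(forNew) : base;
  return required.filter(function (key) { return !config[key]; });
}
```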

Step 5: Make a zip file containing following file(s)

a) Lambda code written in nodejs (index.js)

b) Node Modules folder (node_modules)

c) Other Relevant files

 

Deploying into AWS Lambda:

You can deploy the lambda code into an existing Lambda function or a new one at any time.

Steps to follow:

Step 1: Push config.json file into AWS S3 bucket where trigger event mapping is done.

Step 2: Make the visibility of config.json file as public.

Step 3: Push <Lambda>.zip file into AWS S3.

Step 4: The lambda function will be deployed according to your config.json file. To verify it, go to the AWS CloudWatch console and go through the logs.

Note: Copying files into AWS S3 can be done in two ways:

a) By logging into the AWS S3 web console.

b) By invoking the putObject() method using the AWS SDK (platform independent).

Conclusion:

The Zip file pushed into AWS S3 will be deployed into AWS lambda according to the configuration file (config.json). The automation of lambda function deployment can be configured for any AWS account.

Secure Your Smartphone with Android Device Manager

Smartphones have become an intrinsic part of our daily life, and without them, we feel like only half a person. We spend hundreds of dollars to buy a smartphone with high-end features and store personal data on it or share it with friends and family members. But, unfortunately, what if your phone is stolen or misplaced? How do you locate your phone, or at least wipe the sensitive data?

Don’t worry; Android has a solution for you!

Android has a great native tool, Android Device Manager, which helps locate your device and remotely wipe all data from the phone. You need to set up a Google account on your phone, and you can then use any online device to track your phone down or wipe the data. The important thing, though, is that you need to have it set up and ready in advance.

If your Android phone or tablet device is lost, misplaced, or stolen, you could use Android Device Manager (ADM) to:

  • Find the device: It shows your device’s location.
  • Ring, lock, and erase all data from the device: It helps you remotely ring or lock your device, erase everything on it (after which the device can no longer be tracked), or add a phone number to the lock screen.

Set-Up

Step 1: Switch ADM ON or OFF

Login to your Google account and make sure ADM is turned on. If you have a device with more than one user, only the device owner can control this setting.

  1. Open device’s Settings app in your android phone.
  2. Tap Security –> Device administrators and activate the Android Device Manager.
  3. In the Settings screen, tap Google –> Security. Under “Android Device Manager,” activate Remotely locate this device and Allow remote lock and erase. (Android versions below 6.0 can activate these options from Google Settings app )

Note: If you want to use ADM, make sure you login to your Google Account on your device


Step 2: Make sure the Location access is ON

To use ADM, you need to have location access turned ON. If your Location access is turned OFF, here’s how to turn it ON:

  1. Open device’s Settings app in your android phone.
  2. Scroll down and tap Location settings.
  3. At the top, tap on switch to ON.


Step 3: Check whether Android Device Manager can find your device

After turning ADM on, check whether it is working properly. Sign into your Google Account on android.com/devicemanager and check whether your device shows up. You will probably see a map with your current location and the model of your phone listed below, along with when it was last located and where it is on the map. If you have multiple devices and have set up ADM on them, you will find them in the drop-down list at the bottom.


Locate, Ring, Lock and Erase your misplaced android device over the Internet

Have you ever misplaced your phone between the couch cushions or left it in a restaurant? ADM allows you to quickly ring your phone at maximum volume to help you find it, even if it is in silent mode.


You can also add a screen lock to your device through Android Device Manager over the internet. Whether it is a phone or a tablet, a screen lock is an important security step to safeguard the information stored on your device if it accidentally gets into the wrong hands. Being able to do this through ADM means that even after you lose your phone, you can still add a lock (if there is none) to your device. The Lock option allows you to set or change a PIN or password on your device, display a message on the locked screen (if a recovery message is given), and provide an alternate phone number to call. This is useful if you think someone else may come across your phone and you never set up a password.


Wiping the data of a phone that is gone forever: be aware that everything is erased when you do this, and your phone will be reset to the same settings it had when it was brand new in the box. This means that the ADM app and the associated Google account will be deactivated, and you will no longer be able to track the device. This is helpful for quickly erasing all data if your phone cannot be recovered or has been stolen.


Conclusion

Android Device Manager is the best way to find your lost or stolen android device, so let's take advantage of this tool. Google developed this amazing tool, but it is helpful only while the device is connected to the internet. Hopefully, Google may soon release an offline feature to find lost devices.

WebComponent Vs Angular Formly – Issues in Displaying Forms in Firefox and Safari Browsers

I would like to share some quite interesting stuff with you pertaining to Angular Formly. We at Vmoksha Technologies have set our goal to write optimized, quality code. To attain that goal, we as a team lean towards using new techniques that are contemporary to today's software development.

One such framework is Angular.js, a quite interesting framework that we have used in our recent web projects. As an extension to it, we went ahead and explored the feature that lets you generate HTML forms – Angular-Formly.

 

Smooth Sail, until you hit rock bottom!

Yes. During the testing phase with multiple browsers and their versions, our quality control team figured out that certain browsers, such as FIREFOX and SAFARI, would not display the HTML forms generated by Angular-Formly. We could have just ignored this and continued with our efforts, putting up a disclaimer stating that the project supports only certain browsers and versions.

But that doesn't support the objective of the project. In Vmoksha, “Failure is not an option” (NASA's mission statement) is the mantra, and we wanted to get this resolved. Questions were raised and discussions held to figure out the root cause of the issue. As always, we approached Guru Google as well as developers in multiple forums for a solution. But Lady Luck didn't turn her face towards us for more than a week.

 

Whether the culprit is Angular-Formly or Browsers in scope?

With no potential hint about the root cause as well as a solution, we could think of one possibility, and that is, “Resolve it yourself”;

“EUREKA!!! WE DID IT”

Yes, we found it, fixed it and tested it. FIREFOX, SAFARI – OK.

I could hear your mind; tell us the secret, buddy. I know you are thinking about -

What was the root cause? How did you resolve the issue?

Read on, as I keep explaining the data points behind using Angular-Formly, identifying the root cause and providing the fix.

 

Why we used Angular-Formly?

Even an expert HTML developer would agree that the process of repeating the same block of html code is really frustrating and non-contemporary.

“Angular-Formly is JavaScript-powered forms for AngularJS; it lets you generate HTML forms automatically without much effort.”

Angular-Formly does just that: it reduces the effort of writing HTML forms and delivers them the way we want. Customising Formly might seem difficult, but once you achieve it, you can reuse it for as long as you wish. It takes a few parameters and draws the HTML form on screen for you.

 

How did we resolve the browser issue?

The below images depict the scenarios of our implementation of Angular-Formly forms in different browsers:

In Chrome: the Angular-Formly form renders as expected (screenshot).

In Firefox and Safari: the form fails to display (screenshot).

Our approach to resolving the issue kick-started with the following questions:

  1. Is it the CSS that we are using in the project?
  2. Could it be a problem with the version of Angular-Formly used in the project?
  3. Maybe Angular-Formly doesn’t support FireFox and Safari. Did we check it?
  4. An overlap of a JS or CSS is possible. Who knows?

The last question ignited a thought in our minds. Eventually, we nailed down the root cause of the issue as we kept analysing each JS file referenced by the project. We found something striking – WebComponents.js – and ran the project excluding that component. To our astonishment, the Angular forms displayed seamlessly in all browsers, including FIREFOX and SAFARI. So we extended our research into the use of the component, its source, and its impact.

 

Root Cause of the issue

In our project, we have a placeholder to show maps, and for that very reason, we had Google-Map bower component installed with a list of dependencies.

 
"dependencies": {
    "polymer": "Polymer/polymer#^1.2.3",
    "google-apis": "GoogleWebComponents/google-apis#^1.1.1",
    "iron-resizable-behavior": "PolymerElements/iron-resizable-behavior#^1.0.0",
    "iron-selector": "PolymerElements/iron-selector#^1.0.5"
}

Subsequently, the Polymer dependency bower components got installed, and one such dependency file is the “WebComponents.js” [an optional dependency item]

"repository": {
    "type": "git",
    "url": "https://github.com/Polymer/polymer.git"
},
"dependencies": {
    "webcomponentsjs": "^0.7.20"
},
"devDependencies": {
    "web-component-tester": "*",
    "iron-component-page": "polymerElements/iron-component-page#^1.1.6"
}

 

About Webcomponent.js

WebComponents.js is a set of polyfills built on top of the Web Components specifications. Web Components help you create your own custom HTML elements. Instead of loading your sites with verbose markup, repetitive code, and long scripts, you wrap everything up into neat little custom HTML elements.
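As a sketch of the idea (browser-only; the element name and markup here are hypothetical, not from our project), a custom element wraps markup and behaviour behind a single tag:

```html
<!-- Hypothetical <user-card> element: reusable markup behind one tag -->
<script>
  class UserCard extends HTMLElement {
    connectedCallback() {
      // Render the card from its attributes instead of repeating markup
      this.innerHTML = "<strong>" + this.getAttribute("name") + "</strong>";
    }
  }
  customElements.define("user-card", UserCard);
</script>

<user-card name="Alice"></user-card>
```

Note that this sketch uses the modern Custom Elements API; the webcomponents.js 0.7.x polyfill we had installed targeted an earlier draft of the specification.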

 

Final Fix

Note: The WebComponents.js polyfill layer is no longer needed for browsers that fully implement the Web Components APIs, such as Chrome 36+.

So, we excluded the WebComponents.js script from the project. Since then, the Angular-Formly forms have worked seamlessly in all modern browsers.

We hope this write-up helped you learn something from our experience and resolve the issue.

Thanks for reading our blog. Watch this space as we continue our journey of building robust, quality applications.

Virtual Hosting Using Nginx Server

Nginx is a web server that can also act as a reverse proxy, IMAP/POP3 proxy server, and load balancer. It is well known for its stability, high performance, simple configuration, rich feature set, and low resource consumption, so we can deploy web applications such as static HTML pages and PHP applications on it directly.

Let’s see how to configure Nginx as a reverse proxy for virtual hosting.

#1. Install Nginx on any server (I am using an Ubuntu system).

#2. Choose any domain/sub-domain name and create a CNAME record pointing that domain name to the Nginx server (Ubuntu system, port 80).

Note: Port 80 is the default port for Nginx. If you change the port, you need to map the CNAME accordingly.

#3. Once the CNAME and the Nginx installation are ready, create a conf.d folder inside the Nginx configuration directory (e.g., /etc/nginx/conf.d).

#4. Create a configuration file named after the domain/sub-domain, with a .conf extension.

For example, if you want the application to work on ‘abc.mycompany.com,’ create a configuration file named ‘abc.mycompany.com.conf,’ copy in the code below, and save the file.

   server {
       listen 80;
       server_name abc.mycompany.com;

       location / {
           proxy_pass http://10.10.10.10:portnumber/;
           proxy_http_version 1.1;
           proxy_read_timeout 300000;
           proxy_set_header Upgrade $http_upgrade;
           proxy_set_header Connection 'upgrade';
           proxy_set_header Host $host;
           proxy_cache_bypass $http_upgrade;
       }
   }

#5. Test the configuration with ‘nginx -t’, then restart/reload Nginx.

Now your application will work with the domain name based on your configuration.
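The steps above can also be scripted. Here is a minimal sketch that generates such a virtual-host file; the domain, upstream address, and output path are placeholders (the file would normally go under /etc/nginx/conf.d/):

```shell
#!/bin/sh
# Placeholders – substitute your own domain and backend address
DOMAIN="abc.mycompany.com"
UPSTREAM="http://10.10.10.10:8080/"
OUT="./$DOMAIN.conf"   # normally /etc/nginx/conf.d/$DOMAIN.conf

# Quoted heredoc delimiter so Nginx variables like $host are NOT
# expanded by the shell
cat > "$OUT" <<'EOF'
server {
    listen 80;
    server_name __DOMAIN__;

    location / {
        proxy_pass __UPSTREAM__;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
EOF

# Fill in the placeholders
sed -i -e "s|__DOMAIN__|$DOMAIN|" -e "s|__UPSTREAM__|$UPSTREAM|" "$OUT"
```

After generating the file, you would still validate and reload Nginx (‘nginx -t’ followed by a reload) as described in step #5.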

 

Glossary

listen – the port Nginx listens on

server_name – the domain name

proxy_pass – the URL of the actual running application (the domain name indirectly calls this URL)

proxy_read_timeout – timeout for long-lived connections, in seconds (optional; Nginx’s default proxy_read_timeout is 60 seconds)

Setting up a Secure Email Engine using Amazon SES

Cloud computing, also known as on-line computing, is a kind of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources (e.g., networks, storage, applications, servers, and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide enterprises and users with various capabilities to store and process their data in third-party data centers. It relies on the sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network.

Amazon Web Services (AWS), a subsidiary of Amazon.com, offers a suite of cloud computing services that make up an on-demand cloud computing platform. The scope of this blog is confined to one of the efficient and effective services that is part of AWS – Amazon SES.

Amazon SES is a pay-per-use email distribution engine that provides AWS users with an easy, authentic, cost-effective, reliable, and consistent infrastructure for sending and receiving bulk email using your own domain and email addresses.

amazon web services

Why Does Vmoksha Opt for Amazon SES?

Amazon SES works with Elastic Compute Cloud (EC2), Lambda, Elastic Beanstalk, and various other services. It is available in different regions, such as US East, US West, and EU (Ireland), which allows consumers close to these regions to deploy their applications with high availability and low latency.

Unlike other SMTP players in the market, Amazon SES provides competitive pricing and deliverability.

Listed below are certain benefits of using Amazon SES:

  1. Trusted by Internet Service Providers (ISP) as an authentic source
  2. Cost-Effective & Competitive Pay-per-use pricing
  3. Reliability and Scalability
  4. Bulk Messaging Engine
  5. Automation using Amazon Lambda functions
  6. Ensures deliverability, with active monitoring to make sure that illegal or questionable content is not being distributed
  7. No Infrastructure challenges
  8. Provides mailbox simulator application as a testing environment
  9. Real-time notifications via Amazon SNS.

How Does Vmoksha Make Use of Amazon SES?

The Amazon SES service, along with the Amazon Lambda service, is configured to send emails automatically. Mail sent via SES is verified by the ISP and mail service providers such as Google, and is finally delivered to the employee(s). To ensure the smooth delivery of mail, Vmoksha follows certain steps, which are described in the following sections.

The following diagram explains the scenario:

Amazon SES

Amazon Simple Email Services

Setting up Amazon Simple Email Service (SES):

First, set up an Amazon Web Services (AWS) account to use this service.

After signing up for the AWS account, log in to the management console and look for SES under the Services section, or go directly to https://aws.amazon.com/ses

 

Steps to verify Email Addresses and Domain:

   I.  Steps to Configure Amazon SES

Go to the SES home page, navigate to the Identity Management menu, and choose to verify either your email domain or a list of email addresses.

For example;

Email addresses – sales@abc.com, finance@abc.com and so on…

Domain – abc.com

The verification is managed using the Amazon SES console or Amazon SES API.

Note: Email address and domain verification status is tracked separately for each AWS region.

Although email address verification is quite easy (completed by opening the verification URL sent by SES), domain verification demands the following steps:

    1. Go to Domains under Identity Management and select Verify a New Domain.
    2. Enter the domain name, select Generate DKIM Settings, and click Verify This Domain.
    3. A list of DNS record details will be displayed, which needs to be added to the DNS Zone File of your domain (e.g., via GoDaddy DNS management).
    4. Download the CSV file of DNS records. It contains the details of the Text (TXT), Canonical Name (CNAME), and Mail Exchange (MX) records that need to be added or amended in your DNS records.
    5. Domain verification can be done by just adding a text (TXT) record to your DNS Zone File, but it is highly recommended to perform DKIM verification as well.
    6. A TXT record looks similar to this:

 

_amazonses.abc.com         TXT     pmBGN/7MjnfhTKUZ06Enqq1PeGUaOkw8lGhcfwefcHU=

 

  7. Once the TXT record propagates in the domain, the domain verification status changes to verified.
  8. To ensure that mail is from a trusted source, DKIM verification is required. DKIM verification can be done by adding the CNAME records in the DNS control panel.
  9. Once the DNS changes are reflected, the domain is fully verified.

Email Authentication via SPF or DKIM:

Amazon SES uses Simple Mail Transfer Protocol (SMTP) to send email. Since SMTP does not provide authentication by itself, spammers can send messages pretending to be from the actual sender or domain. Most ISPs evaluate email traffic to check whether the emails are legitimate.

 

Authentication Mechanisms:

There are two authentication mechanisms used by ISPs commonly:

  1. Email Authentication with SPF (Sender Policy Framework)
  2. Email Authentication with DKIM (DomainKeys Identified Mail)

 

Email Authentication with SPF:

Setting up SPF Records and Generating SMTP credentials:

A Sender Policy Framework (SPF) record indicates to ISPs that you have authorized Amazon SES to send mail for your domain. An SPF record looks similar to this:

abc.com       SPF           “v=spf1 include:amazonses.com -all”

 

SMTP credentials can be generated from the SES management console under the Email Sending section. It prompts you to create an IAM user and provides the SMTP username and password upon creation of that IAM user. An alternative is to create a separate IAM user with access to the SES service and use its access key and secret key as the SMTP credentials.
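Once you have SMTP credentials, sending through SES is ordinary SMTP over TLS. A minimal Python sketch follows; the sender, recipient, and credential values are placeholders, and the endpoint shown is the US East SES SMTP endpoint:

```python
# Sketch: sending mail through the Amazon SES SMTP interface.
# Addresses and credentials below are placeholders.
import smtplib
from email.mime.text import MIMEText

def build_message(sender, recipient, subject, body):
    """Build a simple MIME text message."""
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = recipient
    return msg

def send_via_ses(msg, smtp_user, smtp_pass,
                 host="email-smtp.us-east-1.amazonaws.com", port=587):
    """Connect to the SES SMTP endpoint over TLS and send the message."""
    with smtplib.SMTP(host, port) as server:
        server.starttls()                 # SES requires TLS
        server.login(smtp_user, smtp_pass)
        server.sendmail(msg["From"], [msg["To"]], msg.as_string())

if __name__ == "__main__":
    message = build_message("sales@abc.com", "user@example.com",
                            "Hello from SES", "Test body")
    # send_via_ses(message, "SMTP_USERNAME", "SMTP_PASSWORD")
```

The actual send call is commented out because it needs live credentials and a verified sender identity.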

Note:

If an SPF record already exists, you can append “include:amazonses.com” to the existing record. Also, to work with Google Apps, you need to add “include:_spf.google.com ~all”

If an SPF record does not exist in the DNS Zone File, a text (TXT) record can be added with the value “v=spf1 include:amazonses.com -all.”
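The check-and-append logic above can be sketched in Python. The record strings here are illustrative; in practice you would fetch the TXT record for your domain via a DNS lookup:

```python
# Sketch: checking whether an SPF record already authorizes Amazon SES,
# and appending the include mechanism if it does not.
def spf_authorizes_amazonses(spf_record: str) -> bool:
    """Return True if the SPF record includes amazonses.com."""
    if not spf_record.startswith("v=spf1"):
        return False
    mechanisms = spf_record.split()
    return "include:amazonses.com" in mechanisms

def append_amazonses(spf_record: str) -> str:
    """Insert include:amazonses.com before the trailing 'all' qualifier."""
    parts = spf_record.split()
    if parts and parts[-1].endswith("all"):
        parts.insert(len(parts) - 1, "include:amazonses.com")
    else:
        parts.append("include:amazonses.com")
    return " ".join(parts)
```

For example, `append_amazonses("v=spf1 include:_spf.google.com ~all")` yields `"v=spf1 include:_spf.google.com include:amazonses.com ~all"`, matching the Google Apps note above.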

 

Email Authentication with DKIM:

DKIM (DomainKeys Identified Mail) is a standard that allows senders to sign their email messages so that ISPs can use those signatures to verify that the messages are legitimate and were not modified by a third party in transit. DKIM setup is done by adding the CNAME records provided by Amazon SES to the DNS Zone File.

Here are sample CNAME records for DKIM verification:

mvkw7orpsecw2._domainkey.abc.com     CNAME   mvkw7orpsecw2.dkim.amazonses.com
jp5x3nni3zf4uo6._domainkey.abc.com   CNAME   jp5x3nni3zf4uo6.dkim.amazonses.com
7i3j33udxinbhjf6._domainkey.abc.com  CNAME   7i3j33udxinbhjf6.dkim.amazonses.com
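Each record follows the same pattern: the DKIM token from the SES console becomes both the `_domainkey` subdomain and the target under `dkim.amazonses.com`. A small Python sketch of that mapping (using the sample token from above):

```python
# Sketch: deriving the DKIM CNAME records Amazon SES expects from the
# DKIM tokens shown in the SES console.
def dkim_cname_records(domain, tokens):
    """Map each DKIM token to its (record name, record value) pair."""
    return [
        (f"{token}._domainkey.{domain}", f"{token}.dkim.amazonses.com")
        for token in tokens
    ]

records = dkim_cname_records("abc.com", ["mvkw7orpsecw2"])
# records[0] == ("mvkw7orpsecw2._domainkey.abc.com",
#                "mvkw7orpsecw2.dkim.amazonses.com")
```

This mirrors the three sample records above; SES typically issues three tokens per domain.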

 

Finally, it’s time to leave self-managed SMTP servers behind and move on to AWS Simple Email Service (SES). In this way, Amazon Web Services reduces DevOps effort and takes the IT revolution to the next level.

Useful Links: