Saturday, November 20, 2021

What is PaaS?

 Platform as a service (PaaS) is a complete development and deployment environment in the cloud, with resources that enable you to deliver everything from simple cloud-based apps to sophisticated, cloud-enabled enterprise applications. You purchase the resources you need from a cloud service provider on a pay-as-you-go basis and access them over a secure Internet connection.

Like IaaS, PaaS includes infrastructure—servers, storage and networking—but also middleware, development tools, business intelligence (BI) services, database management systems and more. PaaS is designed to support the complete web application lifecycle: building, testing, deploying, managing and updating.

PaaS allows you to avoid the expense and complexity of buying and managing software licenses, the underlying application infrastructure and middleware, container orchestrators such as Kubernetes, or the development tools and other resources. You manage the applications and services you develop, and the cloud service provider typically manages everything else.

Platform as a Service — IaaS includes servers and storage, networking firewalls and security, and the datacenter (physical plant/building). PaaS includes the IaaS elements plus operating systems, development tools, database management and business analytics. SaaS includes the PaaS elements plus hosted apps. The full stack, from top to bottom:
  • Hosted applications/apps
  • Development tools, database management, business analytics
  • Operating systems
  • Servers and storage
  • Networking firewalls/security
  • Data center physical plant/building

Common PaaS scenarios

Organisations typically use PaaS for these scenarios:

Development framework. PaaS provides a framework that developers can build upon to develop or customise cloud-based applications. Similar to the way you create an Excel macro, PaaS lets developers create applications using built-in software components. Cloud features such as scalability, high-availability and multi-tenant capability are included, reducing the amount of coding that developers must do.

Analytics or business intelligence. Tools provided as a service with PaaS allow organisations to analyse and mine their data, finding insights and patterns and predicting outcomes to improve forecasting, product design decisions, investment returns and other business decisions.

Additional services. PaaS providers may offer other services that enhance applications, such as workflow, directory, security and scheduling.

Advantages of PaaS

By delivering infrastructure as a service, PaaS offers the same advantages as IaaS. But its additional features—middleware, development tools and other business tools—give you more advantages:

Cut coding time. PaaS development tools can cut the time it takes to code new apps with pre-coded application components built into the platform, such as workflow, directory services, security features, search and so on.

Add development capabilities without adding staff. Platform as a Service components can give your development team new capabilities without your needing to add staff with the required skills.

Develop for multiple platforms—including mobile—more easily. Some service providers give you development options for multiple platforms, such as computers, mobile devices and browsers, making cross-platform apps quicker and easier to develop.

Use sophisticated tools affordably. A pay-as-you-go model makes it possible for individuals or organisations to use sophisticated development software and business intelligence and analytics tools that they could not afford to purchase outright.

Support geographically distributed development teams. Because the development environment is accessed over the Internet, development teams can work together on projects even when team members are in remote locations.

Efficiently manage the application lifecycle. PaaS provides all of the capabilities that you need to support the complete web application lifecycle: building, testing, deploying, managing and updating within the same integrated environment.

How to send an email with dynamic templates from SendGrid with ASP.NET Core?

This article will discuss sending emails with dynamic templates from SendGrid with ASP.NET Core. SendGrid helps you configure and send emails from your apps using the SMTP API and the SendGrid API. SendGrid also helps you design and configure email templates from its admin portal.

In one of the applications I am building I am using Azure B2C, which currently supports a custom from email address with the help of SendGrid. You can find more details about how to configure it from here. So first you need to create a dynamic template in SendGrid. You can create it from https://mc.sendgrid.com/dynamic-templates, using either the Code Editor or the Design Editor.

SendGrid Dynamic Template

You can put the placeholders inside double curly braces, like {{name}}. Please note they are case-sensitive.

Once you create it, note down the template id, which is required in the code. Next you need to create an API Key, which you can do from the API Keys section - https://app.sendgrid.com/settings/api_keys. You only need to grant Mail Send access. You will need this key in the code as well.

Once you complete these two steps, you can start writing code. To use the SendGrid API, you need to add the SendGrid package with this command - dotnet add package SendGrid. And here is the code.

using SendGrid;
using SendGrid.Helpers.Mail;

var sendGridClient = new SendGridClient("YOUR_API_KEY");
var sendGridMessage = new SendGridMessage();
sendGridMessage.SetFrom("noreply@example.com", "Example");
sendGridMessage.AddTo("anuraj@example.com");
//The Template Id will be something like this - d-9416e4bc396e4e7fbb658900102abaa2
sendGridMessage.SetTemplateId("YOUR_TEMPLATE_ID");
//Here are the placeholder values you need to replace.
sendGridMessage.SetTemplateData(new
{
    name = "Anuraj",
    url = "https://dotnetthoughts.net"
});

var response = await sendGridClient.SendEmailAsync(sendGridMessage);
if (response.StatusCode == System.Net.HttpStatusCode.Accepted)
{
    //Mail sent
}
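
If the status code is not Accepted, the response body returned by the SendGrid client usually explains why. Here is a small, hedged extension of the snippet above; it assumes the Response.Body HttpContent exposed by the SendGrid package:

if (response.StatusCode == System.Net.HttpStatusCode.Accepted)
{
    //Mail sent
}
else
{
    //The body carries SendGrid's error details, e.g. an invalid template id or API key.
    var errorBody = await response.Body.ReadAsStringAsync();
    Console.WriteLine($"SendGrid returned {response.StatusCode}: {errorBody}");
}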

Happy Programming :)

What are IaaS, PaaS and SaaS?

IaaS: This stands for “Infrastructure as a Service”, which provides a set of capabilities like the OS, network connectivity, etc. that sit at the infrastructure level and are delivered on a pay-per-use basis. The infrastructure is used for hosting applications. Examples include Azure VM, VNET, etc.

PaaS: PaaS stands for “Platform as a Service”, which abstracts the underlying infrastructure away from developers, enabling quicker development of applications without the need to worry about hosting management. Examples include Azure Web Apps, Storage services, Cloud Services, etc.

SaaS: SaaS stands for “Software as a Service” and refers to applications delivered using a service delivery model where the applications are simply consumed and used by an organization. These applications are generally monetized by charging the organization for their usage or through ads. Examples include applications like Office 365, Gmail, SharePoint Online, and so on.

The following table shows the difference between on-premises services, IaaS, PaaS and SaaS. We can observe that as we move to the right, the level of control the developer or the user has over the application decreases.

Azure PaaS and serverless services you don't want to miss as an ISV

We collaborate with Independent Software Vendors on a daily basis to improve their solutions with the help of the Microsoft Azure platform. During this journey we come across a wide range of Azure services that add value to the customer solution. Azure is quite extensive, but there are some services we come across often that have a proven track record and add great value to the customer solution.

So why Platform as a Service?
Platform as a Service within Azure comes with a wide range of benefits. For example, Microsoft provides well-documented, extensive SDKs and lifecycle management tooling to support these services. Additionally, you only manage the platform, while security, monitoring and support are available by default, and features such as automated scaling and backup are available out of the box.

From these services we managed to compile a top 5 list with must-have services that will greatly enhance your solution in terms of availability, scalability and costs.


Web Apps
Azure Web Apps are a part of the App Services portfolio that Azure provides. App Services provide you with a platform to deploy web applications, APIs and bots. With most of our ISV partners we started by modernizing traditional platforms, such as IaaS configurations with IIS or Apache, and deploying them as Azure Web Apps. With the right configuration and minor adjustments, transitioning to an Azure Web App requires minimal effort, and in most cases the benefits are amazing: automated updates, no more operating system management, built-in security features, scalability and a great SLA on just a single instance of the Azure Web App.

Azure Web Apps can run almost anything from .Net to NodeJS applications and the available extensions allow for so much more.

As a side note, we do advise implementing Application Insights when deploying to Web Apps. We have experienced that Web Apps work best when programming language best practices are used. With Application Insights, bottlenecks can be identified during the testing phase (prior to production deployment), and combined with the detailed logging from the Web App management console (Kudu) we can pinpoint exactly which pieces of code require adjustment to further optimize the solution to run on Azure Web Apps.
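
For ASP.NET Core applications, enabling Application Insights is typically a single service registration. A minimal sketch, assuming the Microsoft.ApplicationInsights.AspNetCore package is installed, a .NET 6 project with implicit usings, and a connection string or instrumentation key present in configuration:

var builder = WebApplication.CreateBuilder(args);

//Registers Application Insights telemetry collection (requests, dependencies, exceptions).
//The connection string is read from configuration, e.g. appsettings.json or an environment variable.
builder.Services.AddApplicationInsightsTelemetry();

builder.Services.AddControllers();

var app = builder.Build();
app.MapControllers();
app.Run();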


Logic Apps
Logic Apps can be used to add complex workflows to automate your business logic. However, we have also experienced that Logic Apps are great for the “low hanging fruit”. The stuff you want to automate but haven't yet can often be automated relatively easily with Logic Apps, as Microsoft provides numerous default connectors for widely available SaaS solutions. If there is no default connector available for your solution, there is room for customization (in combination with Azure Functions if you will). With Logic Apps you can literally automate your workflow by clicking the right steps together and connecting to the required resources.


Azure Functions
Azure Functions are based on App Service and have a single purpose: execute on demand. With off-the-shelf HTTP triggers and endpoints (REST API), all you need to focus on is the development of the processing you want to take place. For example: automatically convert Word documents to PDF files as soon as files are stored by the system, or query the Cognitive Services API to automatically run text through sentiment recognition. You can pretty much build anything you want on Azure Functions as long as you keep in mind that these are short-lived. The function will eventually shut down until it is triggered again. You will only be billed based on consumption; if there is nothing to be processed, there are no costs.
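
As an illustration, an HTTP-triggered function in C# can be as small as the sketch below. It is a hypothetical “Hello” function and assumes the in-process programming model with the Microsoft.NET.Sdk.Functions package:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class HelloFunction
{
    //Runs only when the HTTP endpoint is called; on the consumption plan you are billed per execution.
    [FunctionName("Hello")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req)
    {
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {(string.IsNullOrEmpty(name) ? "world" : name)}");
    }
}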

Azure Functions are generally categorized as part of the serverless proposition (just like Logic Apps), yet they are very similar to Web Apps as they run on the same platform (Azure App Service). The key difference is that Azure Functions are solely focused on executing a specific piece of code, so you can truly focus on development, whereas Web Apps still require management of the actual platform (whose settings you will probably want to adjust so they fit your solution perfectly).

Azure SQL 
Azure SQL is the go-to database solution for relational databases. As most ISVs we've come across have a history of using relational databases, Azure SQL is a great solution that provides an out-of-the-box database with numerous security features and options in terms of scalability. Especially when you have multiple customers who require multiple databases, the Elastic Pools feature can greatly lower the financial impact.

Azure SQL has extremely high availability by default, but if required, geo-replication is also available to scale across multiple regions. Migrating to Azure SQL often requires minimal changes on the application side, as only the connection string will change.

To further improve your database performance, Azure SQL comes with built-in intelligence that taps into the power of machine learning to analyse your database and provide you with valuable recommendations. Microsoft provides multiple strategies to migrate your database to Azure. You can choose to use the Database Migration Services, import the database yourself or use one of the many other options to copy your data to Azure SQL and get started.
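
To illustrate the earlier point that often only the connection string changes, here is a minimal sketch using the Microsoft.Data.SqlClient package; the server, database, credentials and the Customers table are placeholders:

using System;
using Microsoft.Data.SqlClient;

//The same ADO.NET code works against on-premises SQL Server and Azure SQL;
//only the connection string (server name, credentials) differs.
var connectionString =
    "Server=tcp:yourserver.database.windows.net,1433;Database=yourdatabase;" +
    "User ID=youruser;Password=yourpassword;Encrypt=True;";

using var connection = new SqlConnection(connectionString);
await connection.OpenAsync();

using var command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection);
var count = (int)await command.ExecuteScalarAsync();
Console.WriteLine($"Number of customers: {count}");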


Azure Service Bus
Add reliability to your message queueing and data streams, and decouple your application, with Azure Service Bus. In short: Azure Service Bus is a multi-tenant cloud messaging service provided by Microsoft and is great for asynchronous operations. Service Bus supports multiple protocols (AMQP, SBMP and HTTP). High availability and geographic scaling are enabled with literally the press of a button. The use of Azure Service Bus can be as simple as message queueing, but more complex topologies are definitely supported with the use of Topics and Subscriptions, which allow publish/subscribe scenarios. Topics can have multiple, independent subscriptions. A subscriber to a topic can receive a copy of each message sent to that topic.

Service Bus also comes with a set of advanced features such as auto-forwarding, dead-lettering, auto-delete and duplicate detection. Features which no longer require a complex implementation of a traditional Enterprise Service Bus (ESB), because Azure provided the implementation for you. All you need to do is use it.
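
A minimal sketch of queueing a message with the Azure.Messaging.ServiceBus package; the connection string and the “orders” queue name are placeholders:

using Azure.Messaging.ServiceBus;

//The client manages the underlying AMQP connection to the Service Bus namespace.
await using var client = new ServiceBusClient("<your-service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("orders");

//The message lands on the queue and can be processed asynchronously by a receiver.
await sender.SendMessageAsync(new ServiceBusMessage("Order 42 created"));

For publish/subscribe scenarios the same client can create a sender for a topic instead of a queue, with each subscription receiving its own copy of the message.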

Thursday, November 18, 2021

Docker Interview Questions and Answers

These questions target Docker (a set of platform as a service (PaaS) products). You should know the answers to these frequently asked Docker interview questions to clear a job interview. Development of ASP.NET Web API and Spring Boot microservices with Docker is very popular, so you may find these technical questions very helpful for your next interview.


1. What is Docker?

Docker is an open-source platform that provides the capability to develop, test, ship and run your application in an isolated environment. This isolated environment is called a container. Docker enables you to keep your application separate from your infrastructure, and it reduces the delay between developing your application and deploying it in production.
Docker runs applications in isolation with security, and it allows you to run many containers on a host machine (a machine on which many containers can be created and run). Containers are lightweight because they run directly within the host machine's kernel rather than on a hypervisor (a system that creates and runs virtual machines). This means you can run more containers on given hardware than if you were using virtual machines. You can run Docker containers on virtual machines as well.
The Docker Certified Associate (DCA) exam is designed by Docker to validate these skills.

2. What is the Docker Engine?

Docker Engine is an application with a client-server architecture consisting of these major components:

  • Daemon process (the 'dockerd' command) - a server that runs as a long-running program.
  • A REST API - provides an interface that programs or the CLI can use to talk to the daemon and instruct it (a short code sketch follows below).
  • A CLI (Command Line Interface) client (the 'docker' command).
Docker Engine
Source: Docker Overview
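
As an illustration of talking to the daemon through that REST API from code, here is a hedged sketch, assuming the community Docker.DotNet client package and a local daemon listening on the default UNIX socket (on Windows the endpoint would typically be npipe://./pipe/docker_engine):

using System;
using Docker.DotNet;
using Docker.DotNet.Models;

//Connect to the local Docker daemon over its UNIX socket (Linux/macOS).
using var client = new DockerClientConfiguration(
    new Uri("unix:///var/run/docker.sock")).CreateClient();

//Ask the daemon, via its REST API, for all containers, running or stopped.
var containers = await client.Containers.ListContainersAsync(
    new ContainersListParameters { All = true });

foreach (var container in containers)
{
    Console.WriteLine($"{container.ID[..12]}  {container.Image}  {container.State}");
}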

3. What can you use Docker for?

You can use Docker for:

  • Fast and consistent delivery of applications - Docker allows developers to streamline their work by letting them work in local containers that provide their applications and services. Containers are well suited for CI (Continuous Integration) and CD (Continuous Delivery).
  • Scaling and responsive deployment - Docker's container-based platform allows for highly portable workloads; Docker containers can run on a developer's laptop, on physical or virtual machines, on a cloud provider or in a hybrid environment.
  • Running more workloads on the same hardware - Docker provides a cost-effective alternative to hypervisor-based virtual machines, so you can consume more compute capacity without changing the existing hardware. Docker is very lightweight and fast.

4. Give some Docker Example scenarios.

As Docker provides consistent and fast delivery of applications, here are some example scenarios:

  • Developers write the code and can share it with their team members using Docker containers.
  • Docker is used to push the code and execute automated or manual tests in a Test environment.
  • Developers can fix the bugs in the development environment and can push them to the test environment for testing.
  • Giving fixes or updated applications to customers is easy, as you can push the updated image to production.

5. Explain the Docker architecture.

Docker is based on a client-server architecture. The Docker client talks to the Docker daemon (a long-running program) via the REST API over a network interface or UNIX sockets. The Docker client and daemon can run on the same system or on different machines. The Docker daemon is responsible for building, running and distributing the containers.
Docker Architecture
Source: Docker Architecture

  • Docker Daemon - responsible for listening to API calls and managing Docker objects. It can also communicate with other Docker daemons to manage Docker services.
  • Docker Client - used to communicate with the Docker daemon via the REST API; it can communicate with more than one daemon.
  • Docker Registries - Docker images are stored in a Docker registry. Docker Hub is a public registry and the default location for Docker images configured in Docker.
  • Docker Objects - when you use Docker you work with many things such as images, containers, volumes and networks; these are called Docker objects.

6. What is DTR (Docker Trusted Registry)?

If you are using Docker Data Center (DDC), then Docker provides an enterprise grade image storage solution called Docker Trusted Registry (DTR). DTR can be installed on virtual private networks or on-premises so that you can store your images in a secure way and behind your firewall. DTR also provides a User Interface that can be accessed by authorized users to view and manage the repositories.

7. What are the common Docker objects?

With Docker you use many things like images, containers, registries, services, volumes etc. These are all Docker objects.

8. Explain the Docker Images.

A Docker image is a read-only template with instructions that forms the basis of a container. It's an ordered collection of file-system changes. An image is typically based on another image with some customizations. For example, you can create an image that is based on the 'ubuntu' image and install additional application dependencies on top of it.
Images also contain a set of run parameters (such as the starting executable) that take effect when the container starts.

9. Why are images lightweight, fast and small?

Docker allows you to create your own image using a Dockerfile containing the simple instructions needed to build it. Each instruction creates a layer in the Docker image. When you change the Dockerfile, only the changed layers are rebuilt, not all of them; this is why Docker images are fast, small and lightweight in comparison to other virtualization systems.

10. Describe Docker Containers?

Containers are created from images; you can say a container is a runnable instance of an image. Once you build your image with the application and all its dependencies, multiple containers can be instantiated from it. Each container is isolated from the others and from the host machine as well. So a Docker container is defined by an image and the configuration options provided when you create or start it.

11. Explain the underlying technology of Docker.

Docker is developed in the Go language and takes advantage of many Linux kernel features to deliver its functionality. Docker also relies on the technologies below.

  • Namespaces - Docker provides an isolated environment called a container; namespaces provide that isolation layer.
  • Control groups - control groups (cgroups) allow Docker Engine to share hardware resources among containers.
  • Union file systems - the building blocks of a container's file system are managed by a union file system (UnionFS).
  • Container format - Docker Engine combines namespaces, control groups and UnionFS into a wrapper called a container format.

12. What is Docker Swarm? How is it different from SwarmKit and Swarm mode?

Docker Swarm is Docker's native container orchestration tool. It's an older, standalone tool that is separate from Docker Engine. It allows users to manage containers deployed across multiple hosts. It is still a good choice for multi-host orchestration, but Swarm mode is recommended for new Docker projects.
For more about Docker Swarm vs SwarmKit visit Difference between Docker Swarm, Swarm Mode & SwarmKit

13. What is Docker SwarmKit?

Docker SwarmKit is a separate project which implements the docker orchestration system. It's a toolkit that orchestrates distributed systems at any scale. SwarmKit's main benefits include:
  • Distributed - SwarmKit uses the Raft consensus algorithm to coordinate and does not depend on a single point of failure to make decisions.
  • Secure - SwarmKit uses mutual TLS for role authorization, node authentication and transport encryption.
  • Simple - it's simple to operate and does not need any external database. It minimizes infrastructure dependencies.
For more visit Docker SwarmKit

14. What is Dockerfile?

A Dockerfile is a file that holds the set of instructions used to create an image. Each instruction in the Dockerfile is responsible for creating a layer in the image. When you rebuild the image, only changed layers are rebuilt.

15. What are services?

A service is created to deploy an application image when the Docker Engine is in swarm mode. A service is essentially just an image for a microservice in the context of a larger application. Examples of services might include a database, an HTTP server or any executable program that you want to run in a distributed environment.
When creating a service you must specify the container image to use and commands to execute inside running containers. For more visit How services work in Docker.

16. Docker Swarm vs Kubernetes?

Docker Swarm and Kubernetes are both open-source container orchestration tools, but each comes with some pros and cons. Docker Swarm Pros:
  • It's built for use with Docker Engine and has its own Swarm API.
  • Provide easy integration with tools such as Docker CLI and Docker Compose.
  • Docker Swarm offers easy installation and setup for Docker environments.
  • Offers various functionalities including - Internal load balancers, auto-scaling groups, swarm managers for availability control, etc.
Docker Swarm Cons:
  • It does not offer many customizations or extensions.
  • Provides less functionality when compared with Kubernetes.
Kubernetes Pros:
  • A very active development in code base by the open-source community.
  • Well tested by big organizations such as Google and IBM, and works well on most operating systems.
  • Offers various functionalities including load balancing, horizontal scalability, storage orchestration, automated rollouts and rollbacks, service discovery, self-healing, intelligent scheduling and batch execution, etc.
Kubernetes Cons:
  • Specialized knowledge is required to manage the Kubernetes Master.
  • The open-source community provides frequent updates, so careful patching is required to avoid breaking changes.
  • Requires additional tools such as the kubectl CLI, plus services, CI/CD and other practices, for full management.
For more visit Docker Swarm vs Kubernetes.

17. What is Containerization? And why is it being popular?

18. What are the docker client commands?

19. What are the docker registry commands?

20. What are the docker daemon commands?

21. Give an example of a simple Dockerfile.

22. What is the difference between RUN and CMD in Dockerfile?

23. Explain some quick facts about Docker.

Developers consider Docker a good choice to deploy applications any time, anywhere - on-premises or in the cloud. Here are some facts about it.

  • Docker was launched by Solomon Hykes in 2013; he went on to serve as Docker's CTO and Chief Architect.
  • Docker allows us to build, ship, run and orchestrate applications in an isolated environment.
  • Starting a Docker container is much faster than starting a virtual machine, but virtual machines are not obsolete yet.
  • Docker Hub offers free options for developers to host public repositories and paid options for private repositories.
  • Docker Desktop and Docker Compose have reduced local development environment setup time, which helps developers be productive.

24. What is the future of Docker?

Docker offers a quick way to build, develop, ship and orchestrate distributed applications in isolated environments.
Docker is used by many companies to make developers' processes faster. Docker also helps automate deployment management. So many companies are adopting this containerized approach for their application development and deployment.
Docker also integrates with hundreds of tools such as Bitbucket, Jenkins, Kubernetes, Ansible, Amazon EC2, etc. So there are a lot of jobs in the market for Docker skills, and Docker professionals are paid very good salaries.

Some General Interview Questions for Docker

1. How much will you rate yourself in Docker?

When you attend an interview, the interviewer may ask you to rate yourself in a specific technology like Docker. Your rating depends on your knowledge of and work experience with Docker.

2. What challenges did you face while working on Docker?

This question may be specific to your technology and depends completely on your past work experience. You just need to explain the challenges you faced related to Docker in your project.

3. What was your role in the last Project related to Docker?

This is based on the role and responsibilities assigned to you and the functionality you implemented using Docker in your project. This question is asked in almost every interview.

4. How much experience do you have in Docker?

Here you can describe your overall work experience with Docker.

5. Have you done any Docker Certification or Training?

This depends on whether you have done any Docker training or certification. Certifications and training are not essential, but they are good to have.

Sunday, November 7, 2021

Multiple Choice Questions - Angular2 Components

 1. A component is a class decorated with some metadata and . . . . . . are functions that can modify a class.


A) Selectors
B) Decorators
C) Mentors
D) Bindings

2. Consider the statement "Every component must be declared in one and only one Angular module". Is it true or false?

A) True
B) False

3. . . . . . . . . . . . is a life cycle hook called by Angular2 to indicate that Angular is done creating the component.

A) ngOnChanges
B) ngAfterViewInit
C) ngAfterContentInit
D) ngOnInit

4. Elements present inside the template of a component are called . . . . . . . children.

A) content
B) nested
C) view
D) template

5. Some of the core properties that we require when creating Angular 2 components are

A) Providers
B) Templates & selectors
C) Items
D) Directives

6. ngDoCheck()

A) It is used to override the default change detection algorithm for a directive.
B) It is invoked on every check of your component’s view.
C) It is invoked when the view is completely initialised.
D) It is invoked every time the directive’s content is checked.

7. . . . . . . . . . are used to invoke events on the parent component from child components in the hierarchy of components.

A) Inputs
B) Outputs
C) Blueprints
D) hooks

8. The scope of the component travels through the chain of components. However, components that are defined in the . . . . . . . . . array aren’t injected or inherited by the child components in the hierarchical chain.

A) services
B) singletons
C) parent
D) viewproviders

9. Components are basically directives with views but we cannot implement them by nesting/composing other directives and components.

A) True
B) False

10. Components conduct a provider lookup upwards because each component has its own built-in . . . . . . . , which is specific to it.

A) service
B) instance
C) hook
D) injector

Answers
1) b, 2) a, 3) d, 4) c, 5) a,b,c, 6) a, 7) b, 8) d, 9) b, 10) d

Multiple Choice Questions - Angular2 Events

 1. Event binding can be defined . . . . . . . . .


A) by wrapping the event in (parenthesis)
B) by prefixing it with in-
C) by wrapping the event in {curly brackets}
D) by prefixing it with on-

2. EventEmitter class acts both as an observer and observable.

A) True
B) False

3. Events in Angular 2 behave like normal DOM events. They can bubble up but cannot propagate down.

A) True
B) False

4. EventEmitter class’s simple interface, which basically encompasses two methods . . . . . . . . . , can therefore be used to trigger custom events and listen to events as well, either synchronously or asynchronously.

A) exit()
B) superscript()
C) subscribe()
D) emit()

5. The Angular framework provides event binding using built-in events as well as custom events. Custom events are EventEmitter instances. To create a custom event we need to create an instance of EventEmitter annotated with . . . . . . .

A) @Input()
B) @Get()
C) @Output()
D) @Set()

6. EventEmitter class is used by directives and components to emit custom Events.

A) True
B) False

7. @Output() myEvent = new EventEmitter();

A) Declares an output property that fires events that you cannot subscribe to with an event binding.
B) Declares an output property that fires events that you can subscribe to with an event binding.
C) Declares an output property that overrides events that you can subscribe to with an event binding.
D) Declares an output property that subscribes to events that you can subscribe to with an event binding.

8. . . . . . . . . need to be passed as a parameter in the event callback from the template to capture the event object.

A) $event.start
B) $events
C) $eventobj
D) $event

9. Calling . . . . . . . . on the event prevents propagation.

A) stopEventPropagation
B) preventEventPropagation
C) stopPropagation
D) preventPropagation

10. Events on child elements are propagated upwards, and hence event binding is also possible on a parent element.

A) True
B) False

Answers
1) a,d, 2) a, 3) b, 4) c,d, 5) c, 6) a, 7) b, 8) d, 9) c, 10) a

Get max value for identity column without a table scan

  You can use   IDENT_CURRENT   to look up the last identity value to be inserted, e.g. IDENT_CURRENT( 'MyTable' ) However, be caut...