Overview of Microsoft Graph


Microsoft Graph is the gateway to data and intelligence in Microsoft 365. It provides a unified programmability model that you can use to access the tremendous amount of data in Microsoft 365, Windows, and Enterprise Mobility + Security.

Microsoft Graph exposes REST APIs and client libraries to access data on the following Microsoft cloud services:

- Microsoft 365 core services: Bookings, Excel, Microsoft 365 compliance eDiscovery, Microsoft Search, OneDrive, OneNote, Outlook/Exchange, People (Outlook contacts), Planner, SharePoint, Teams
- Enterprise Mobility + Security services: Advanced Threat Analytics, Advanced Threat Protection, Azure Active Directory, Identity Manager, and Intune
- Windows services: activities, devices, notifications, Universal Print
- Dynamics 365 Business Central services

What’s in Microsoft Graph API?
The Microsoft Graph API is a RESTful web API that enables you to access Microsoft Cloud service resources. After you register your app and get authentication tokens for a user or service, you can make requests to the Microsoft Graph API.

In the Microsoft 365 platform, three main components facilitate the access and flow of data:
1. Microsoft Graph API
2. Microsoft Graph connectors
3. Microsoft Graph Data Connect

Popular API requests:
GET my profile – https://graph.microsoft.com/v1.0/me
GET my files – https://graph.microsoft.com/v1.0/me/drive/root/children

You can try these requests in Graph Explorer: https://developer.microsoft.com/en-us/graph/graph-explorer
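As a sketch, a request like the ones above can be built with only the Python standard library. Token acquisition is omitted; the `build_graph_request` helper and the placeholder token are illustrative, not part of any Graph SDK.

```python
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_graph_request(path, access_token):
    """Build an authenticated GET request for Microsoft Graph.
    `access_token` stands in for an OAuth 2.0 bearer token obtained
    beforehand from the Microsoft identity platform."""
    return urllib.request.Request(
        GRAPH_BASE + path,
        headers={"Authorization": "Bearer " + access_token},
    )

# Example: a request for the signed-in user's profile.
# (urllib.request.urlopen(req) would actually send it.)
req = build_graph_request("/me", "<ACCESS_TOKEN>")
```

In a real app the token would come from a library such as MSAL after the app registration step described above.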

Source: https://learn.microsoft.com/en-us/graph/overview

Robotic Process Automation(RPA)


RPA is the technique of automating a business process so that a task is completed without human intervention. An RPA developer is a trained professional with expertise in software development; the core of the role is to build and deploy software robots (bots) that automate repetitive, rule-based tasks.

Robotic – Robots are entities developed to complete tasks otherwise performed by humans.
Process – A process is a sequence of tasks combined to perform a meaningful action.
Automation – Automation means performing tasks without human intervention.

RPA Tools:
Blue Prism
UiPath
Automation Anywhere

Life Cycle of an RPA Project:

Identify
Analyse
Design
Develop
Test
Implement

Roles & Responsibilities
- They identify and develop the most effective automated processes for a business organization.
- They aim to increase the efficiency of workflows across the organization.
- They continuously monitor automated processes to assess their output.
- They create detailed outlines of all the processes and study the data to make sure the required production is being met by the automation system.
- They are responsible for quality assurance.

Required skills:
1. Programming Skills
2. Web technologies such as HTML, and scripting languages such as JavaScript
3. SQL
4. Knowledge of AI and ML concepts
5. Familiarity with Automation Tools
6. Required Analytical and Soft skills

Advantages of RPA:

Improved quality
Reduced time
Cost efficiency

Applications Of RPA :

Data migration and Data entry.
Data validation.
Extracting data from PDF and other scanned documents.
Regular report generation.
Creating and developing invoices.
Generating mass emails.
Updating CRM.
Automated testing.
Expense management.
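Many of these applications boil down to "check the data, then act." A minimal sketch of the data-validation step a bot might run before automated data entry; the field names and rules here are hypothetical, not taken from any RPA product.

```python
import re

# Hypothetical rule: records need a well-formed email and a non-negative amount.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def validate(record):
    """Return a list of validation errors for one record."""
    errors = []
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("invalid email")
    if record.get("amount", -1) < 0:
        errors.append("invalid amount")
    return errors

def triage(records):
    """Split records the way a bot would: clean ones continue to
    automated entry, the rest are queued for human review."""
    ok, review = [], []
    for rec in records:
        (review if validate(rec) else ok).append(rec)
    return ok, review
```

In a real RPA tool (UiPath, Blue Prism, etc.) the same logic would live in a workflow activity rather than hand-written code.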

Microservices Architecture


It is an approach for creating loosely coupled services that can be developed, deployed, and maintained independently. Each service is responsible for a discrete task and can communicate with other services through simple APIs to solve a larger, more complex business problem.

Key Benefits of a Microservices Architecture
- Small teams can work independently if required
- Services can be deployed independently: only the changed service is redeployed, instead of redeploying an entire application
- Improved fault isolation
- The technology stack can differ from service to service

Some points to think about

1. How to Decompose
Create services based on business capabilities.
For example, the business capabilities for an online shopping application might include the following:

● Product Catalog Management
● Inventory Management
● Order Management
● Delivery Management
● User Management
● Product Reviews Management

2. Design the Individual Services Carefully
- When designing the services, carefully define what each service will expose, which protocols will be used to interact with it, and so on.

Example – Suppose a service (Service 1) stores all the information it needs in a database. When another service (Service 2) is created that needs the same data, Service 2 reads that data directly from the database. Now, if the schema needs to change, the flexibility to make that change is lost.
Alternate way – Service 2 should call Service 1 and avoid going directly to the database, thereby preserving the utmost flexibility for whatever schema changes may be required.
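The alternative can be sketched as two toy in-process services. The class and field names are illustrative; in practice Service 2 would call Service 1 over HTTP or messaging, not a method call.

```python
class CustomerService:
    """Service 1: owns the customer data and its storage schema."""
    def __init__(self):
        # Private storage: the schema here can change freely, because
        # no other service reads these rows directly.
        self._rows = {1: {"first": "Ada", "last": "Lovelace"}}

    def get_customer(self, customer_id):
        # Public contract: only this response shape is promised to callers.
        row = self._rows[customer_id]
        return {"id": customer_id, "name": row["first"] + " " + row["last"]}

class OrderService:
    """Service 2: needs customer data, but calls Service 1's API
    instead of reading Service 1's database tables."""
    def __init__(self, customers):
        self._customers = customers

    def describe_order(self, order_id, customer_id):
        customer = self._customers.get_customer(customer_id)
        return f"Order {order_id} for {customer['name']}"
```

If CustomerService later splits `first`/`last` storage differently, OrderService is unaffected as long as the `get_customer` contract holds.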

3. Building and Deploying
- Create each service using the best-suited technology.
- Write automated test cases.
- Deploy the services.

4. Deploy
It’s important to write consumer-driven contracts for any API that is being depended upon, to ensure that new changes in that API don’t break your service.
Two models for deployment:
1. Multiple microservices per operating system.
2. One microservice per operating system (using hypervisors, whereby multiple virtual machines are provisioned on the same host). Docker is one implementation of that model.

Making Changes to Existing Microservice APIs While in Production
- Version your API (downside: you must maintain multiple versions, and any new changes or bug fixes must be made in all of them).
- Alternatively, implement another endpoint in the same service when changes are needed.
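The versioned-endpoint idea can be sketched as a toy route table; the paths and payload shapes are hypothetical.

```python
# Old contract, kept alive until every consumer has migrated.
def get_user_v1(user_id):
    return {"id": user_id, "name": "Ada Lovelace"}

# v2 splits the name field -- a breaking change for v1 clients,
# so it ships under a new version prefix instead of replacing v1.
def get_user_v2(user_id):
    return {"id": user_id, "first_name": "Ada", "last_name": "Lovelace"}

ROUTES = {
    "/v1/users": get_user_v1,
    "/v2/users": get_user_v2,
}

def handle(path, user_id):
    """Dispatch a request to whichever contract the caller asked for."""
    return ROUTES[path](user_id)
```

The downside noted above is visible here: a bug fix to user lookup now has to be applied in both handlers.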

5. Decentralize Things
- One option is to have a single team that develops, deploys, maintains, and supports each service.
- Another is that a developer who needs a change in a service can check out the code, work on the feature, and get it reviewed, instead of waiting for the service owner to pick up and implement the change.

6. Making Standards
-Best practices
-Error handling

Service Dependencies
In a microservices architecture, over time, each service starts depending on more and more services. This can introduce more problems as the services grow, for example, the number of service instances and their locations (host+port) might change dynamically. Also, the protocols and the format in which data is shared might vary from service to service.

7. Failure
What’s critical with a microservices architecture is to ensure that the whole system is not impacted or goes down when there are errors in an individual part of the system.
- Use patterns like Bulkhead (isolated ship compartments) and Circuit Breaker (which trips when a failure threshold is reached).
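A minimal sketch of the Circuit Breaker pattern; the thresholds and timings are arbitrary, and real services would use a hardened library rather than this toy.

```python
import time

class CircuitBreaker:
    """After `threshold` consecutive failures the circuit "trips" (opens)
    and calls fail fast for `reset_after` seconds, instead of letting
    every caller hang on a sick downstream service."""
    def __init__(self, threshold=3, reset_after=30.0):
        self.threshold = threshold
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            # Half-open: the wait has elapsed, allow one trial call.
            self.opened_at = None
            self.failures = 0
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result
```

Failing fast is what keeps one broken service from dragging down every service that depends on it.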

 

SQL Server

SQL Server slow performance
#1: Bad Schema Design
Poor normalization – flat, wide tables or over-normalization
Redundant data in databases
Weak referential integrity (missing foreign keys and constraints)
Wide composite primary keys (and clustered indexes)
No stress testing of schema robustness against expected growth patterns

#2: Inefficient T-SQL Queries
Using NOT IN or IN instead of NOT EXISTS or EXISTS
Using cursors or WHILE loops instead of INSERT…SELECT or SELECT…INTO
Using SELECT * instead of only the necessary column names
Nesting subqueries, creating a complex execution plan
Using functions on indexed columns in the WHERE clause
Datatype mismatches in predicates (WHERE conditions or joins)
Interchanging UNION and UNION ALL
Unnecessarily using DISTINCT everywhere
Dynamic SQL
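The first point hides a correctness trap as well as a performance one. This runnable sketch uses SQLite via Python's standard library purely for illustration; the same semantics apply to T-SQL on SQL Server. A single NULL in the subquery makes NOT IN return nothing, while NOT EXISTS behaves as intended.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers(id INTEGER, name TEXT);
    CREATE TABLE orders(customer_id INTEGER);
    INSERT INTO customers VALUES (1,'Ann'),(2,'Bob'),(3,'Cid');
    INSERT INTO orders VALUES (1),(NULL);  -- note the NULL
""")

# NOT IN: one NULL in the subquery makes the predicate unknown
# for every row, so NO customers are returned.
not_in = con.execute(
    "SELECT name FROM customers "
    "WHERE id NOT IN (SELECT customer_id FROM orders)").fetchall()

# NOT EXISTS: NULLs are harmless; Bob and Cid are returned.
not_exists = con.execute(
    "SELECT name FROM customers c "
    "WHERE NOT EXISTS (SELECT 1 FROM orders o WHERE o.customer_id = c.id)"
).fetchall()

print(not_in)      # []
print(not_exists)  # [('Bob',), ('Cid',)]
```

Beyond correctness, EXISTS/NOT EXISTS usually lets the optimizer stop probing as soon as one matching row is found.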

#3: Poor Indexing Strategies
Indexing every single foreign key
Indexing every column in the table
Many single-column indexes
Preferring heap tables over clustered indexes
Under-indexing your tables
Not maintaining your indexes

#4: Incorrect Server settings
Keeping Maximum Degree of Parallelism (MAXDOP) at 0
Not setting an appropriate fill factor for indexes
Low file autogrowth settings
A single TempDB data file
Hosting SQL Server data and log files on the same drive
Running antivirus scans on SQL Server files
An incorrect Max Server Memory configuration
High latency on your log files

#5: Hardware issue

Why is my SQL Server Query Suddenly Slow?
1. Look for the most expensive queries running in SQL Server over the period of the slowdown
2. Check the query plans, query execution statistics, and wait types for those queries
3. Review the query history over the period to see where performance changed
4. Compare usage over the periods of “normal” and “bad” performance to see what changed
5. Diagnose and tune the offending queries

Index scan vs Index seek
An index scan reads all the rows in the table, while an index seek retrieves only the selective rows that qualify. INDEX SCAN: a scan touches every row in the table whether it qualifies or not, so its cost is proportional to the total number of rows in the table. INDEX SEEK: a seek navigates the index directly to the qualifying rows, so its cost is proportional to the number of qualifying rows.
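The difference is easy to see in a query plan. This sketch uses SQLite's EXPLAIN QUERY PLAN because it is runnable from the standard library; the exact plan wording varies by engine and version, but SQL Server execution plans show the same scan/seek distinction.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders(id INTEGER, amount INTEGER);
    CREATE INDEX ix_orders_amount ON orders(amount);
""")

def plan(sql):
    # EXPLAIN QUERY PLAN returns rows whose last column describes each step.
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

# No index on id -> the whole table is scanned.
scan_plan = plan("SELECT * FROM orders WHERE id = 1")

# Sargable predicate on an indexed column -> a seek ("SEARCH ... USING INDEX").
seek_plan = plan("SELECT * FROM orders WHERE amount = 100")

# A function on the indexed column defeats the index -> back to a scan,
# which is exactly the "functions in the WHERE clause" pitfall above.
fn_plan = plan("SELECT * FROM orders WHERE abs(amount) = 100")
```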

Blazor (WebAssembly)

What is Blazor?

Blazor is a framework for building interactive client-side web UI with .NET. Blazor is based on open web standards. It is being developed by Microsoft.

As long as the browsers support those open web standards, they will also support Blazor.

Even if Blazor disappears, you will be able to transfer much of the knowledge you gain by learning Blazor to other technologies. HTML and CSS have been around for 20+ years, and there is no sign that they will disappear anytime soon.

Also, C# is a very popular programming language that can be used for building many different types of applications, not only Blazor web applications.

Use:
Create rich interactive UIs with C#
Render UI as HTML and CSS
Integrate with modern hosting platforms (e.g., Docker)
Components can be shared and distributed as Razor class libraries or NuGet packages.

Blazor WebAssembly:
-Blazor WebAssembly is a single-page app (SPA) framework for building interactive client-side web apps with .NET.
-Running .NET code inside web browsers is made possible by WebAssembly
-WebAssembly is a compact bytecode format optimized for fast download and maximum execution speed

How does it work?
When a Blazor WebAssembly app is built and run in a browser:
- C# code files and Razor files are compiled into .NET assemblies.
- The assemblies and the .NET runtime are downloaded to the browser.
- Blazor WebAssembly bootstraps the .NET runtime and configures it to load the assemblies for the app.
- The Blazor WebAssembly runtime uses JavaScript interop to handle DOM manipulation and browser API calls.

Questions –
What is Docker?

Artificial intelligence (AI)


Artificial intelligence (AI) is intelligence demonstrated by machines. AI refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions.

Artificial intelligence is the theory and development of computer systems able to perform tasks that normally require human intelligence, such as speech recognition, decision-making, and visual perception.

Types of AI

Reactive AI

It is programmed to provide a predictable output based on the input it receives.
Examples: chess engines and email spam filters.

Limited Memory AI
Limited memory AI learns from the past and builds experiential knowledge by observing actions or data.
For example, autonomous vehicles use limited memory AI to observe other cars’ speed and direction, helping them “read the road” and adjust as needed.

Theory of Mind AI
With this type of AI, machines will acquire true decision-making capabilities that are similar to humans. Machines with theory of mind AI will be able to understand and remember emotions, then adjust behavior based on those emotions as they interact with people.

Self-aware AI
The most advanced type of artificial intelligence is self-aware AI. When machines can be aware of their own emotions, as well as the emotions of others around them, they will have a level of consciousness and intelligence similar to human beings. This type of AI will have desires, needs, and emotions as well.

AI vs Machine learning

Artificial intelligence is a technology that enables a machine to simulate human behavior. Machine learning is a subset of AI that allows a machine to learn automatically from past data without being explicitly programmed. The goal of AI is to build smart computer systems that can solve complex problems the way humans do.


Differences Between .NET Framework, .NET Core, and .NET Standard

.NET is a developer platform made up of tools, programming languages, and libraries for building many different types of applications.

.NET Framework

.NET Framework is the original implementation of .NET. It supports running websites, services, desktop apps, and more on Windows. .NET Framework 4.8 is the last version of .NET Framework.

.NET Core 

.NET Core is a cross-platform implementation for running websites, services, and console apps on Windows, Linux, and macOS. .NET is open source on GitHub.

Version           Visual Studio
.NET 5 (latest)   VS 2019
.NET Core 3.x     VS 2019
.NET Core 2.x     VS 2017, 2019
.NET Core 1.x     VS 2017

.NET Core 3.1 and .NET Core 2.1 both have long-term support (LTS).
.NET Core 3.x applications run only on .NET Core.
ASP.NET Core 2.x applications can run on .NET Core as well as on .NET Framework.

.NET Standard

.NET Standard is a formal specification of the APIs that are common across .NET implementations. This allows the same code and libraries to run on different implementations.

.NET Framework and .NET Core have different BCLs (base class libraries), so a .NET Framework library is not compatible with .NET Core and vice versa. To solve this problem, Microsoft introduced .NET Standard.

For example, to develop a library that supports both .NET Framework 4.5.1 and .NET Core 1.0, we need to target .NET Standard 1.2, i.e. the highest .NET Standard version that both frameworks implement.

Xamarin/Mono 

Xamarin/Mono is a .NET implementation for running apps on all the major mobile operating systems, including iOS and Android.

C# Version History

Version .NET Framework Visual Studio Important Features
C# 1.0 .NET Framework 1.0/1.1 Visual Studio .NET 2002 First release of C#
C# 2.0 .NET Framework 2.0 Visual Studio 2005
  • Generics
  • Partial types
  • Anonymous methods
  • Nullable types
  • Iterators
  • Covariance and contravariance
C# 3.0 .NET Framework 3.0/3.5 Visual Studio 2008
  • Auto-implemented properties
  • Anonymous types
  • Query expressions
  • Lambda expression
  • Expression trees
  • Extension methods
C# 4.0 .NET Framework 4.0 Visual Studio 2010
  • Dynamic binding
  • Named/optional arguments
  • Generic covariant and contravariant
  • Embedded interop types
C# 5.0 .NET Framework 4.5 Visual Studio 2012/2013
  • Asynchronous members
  • Caller info attributes
C# 6.0 .NET Framework 4.6 Visual Studio 2013/2015
  • Static imports
  • Exception filters
  • Property initializers
  • Expression bodied members
  • Null propagator
  • String interpolation
  • nameof operator
  • Dictionary initializer
C# 7.0 .NET Core Visual Studio 2017
  • Out variables
  • Tuples and deconstruction
  • Pattern matching
  • Local functions
  • Ref locals and returns
  • Throw expressions
  • Expanded expression-bodied members

 

List of .NET Dependency Injection Frameworks

What is Dependency injection?

Dependency injection is a design pattern in which an object receives the other objects it depends on, rather than creating them itself. It is a way of doing loosely coupled programming: it separates the concern of constructing an object from the concern of using it, and is a technique for achieving inversion of control between classes and their dependencies.
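A minimal constructor-injection sketch (the class names are illustrative, and Python stands in for any language here):

```python
class SmtpEmailSender:
    """Real dependency (sketch): would talk to an SMTP server."""
    def send(self, to, body):
        # Network call omitted in this sketch.
        return f"sent to {to}"

class FakeEmailSender:
    """Test double, injected in unit tests instead of the real sender."""
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        self.sent.append((to, body))
        return f"recorded for {to}"

class OrderProcessor:
    # Constructor injection: the dependency is received, not constructed,
    # so OrderProcessor is loosely coupled to any compatible sender.
    def __init__(self, email_sender):
        self._email_sender = email_sender

    def place_order(self, customer_email):
        return self._email_sender.send(customer_email, "Order confirmed")
```

The containers listed below automate exactly this wiring: you register which concrete type satisfies each dependency, and the container constructs the object graph for you.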

List of .NET Dependency Injection Frameworks

Castle Windsor – Castle Windsor is a best-of-breed, mature Inversion of Control container available for .NET and Silverlight. Windsor is part of the Castle stack, which includes MonoRail, ActiveRecord, etc.; Castle ActiveRecord itself builds on top of NHibernate.
Unity – Lightweight extensible dependency injection container with support for constructor, property, and method call injection. Supported by Microsoft.
Spring.NET – Spring.NET is an open-source application framework that makes building enterprise .NET applications easier.
VS MEF – Managed Extensibility Framework (MEF) implementation used by Visual Studio.

Other frameworks
Autofac, DryIoc, Ninject, Lamar, LightInject, Simple Injector, Microsoft.Extensions.DependencyInjection, Scrutor, TinyIoC, Stashbox, StructureMap

Apache Kafka Introduction


Apache Kafka is an event streaming platform. It provides three key capabilities:
- To publish (write) and subscribe to (read) streams of events
- To store streams of events durably and reliably for as long as you want
- To process streams of events as they occur or retrospectively

What is event streaming?
It is a way of capturing data in real time from event sources like databases, sensors, mobile devices, and software applications, then:
- storing these events durably
- processing these events
- routing these events to wherever they are needed
Event streaming thus ensures a continuous flow and interpretation of data so that the right information is at the right place, at the right time.

Uses :
To capture and analyze sensor data from IoT devices in factories
To capture payments
To capture financial transactions
To track and monitor cars, trucks, shipment etc.

Working:
Kafka is a distributed system consisting of servers and clients that communicate via a high-performance TCP network protocol.
Servers: Kafka runs as a cluster of one or more servers that can span multiple datacenters or cloud regions. Some of these servers form the storage layer, called the brokers.
- Other servers run Kafka Connect to continuously import and export data.
Clients: Clients allow you to write distributed applications and microservices that read, write, and process streams of events in parallel, at scale, and in a fault-tolerant manner, even in the case of network problems or machine failures.

When something of interest happens, it is recorded as an event (also called a record or message). An event has a key, a value, and a timestamp:
Event key: “Prakash”
Event value: “Paid Rs.100 to Vijay”
Event timestamp: “May. 25, 2022 at 7:06 p.m.”

Producers are those client applications that publish (write) events to Kafka, and consumers are those that subscribe to (read and process) these events.
Events are organized and durably stored in topics. (A topic is like a folder in a filesystem, and the events are the files in that folder.)
Topics are partitioned, meaning a topic is spread over a number of “buckets” located on different Kafka brokers.
APIs
-The Admin API to manage and inspect topics, brokers, and other Kafka objects.
-The Producer API to publish (write) a stream of events to one or more Kafka topics.
-The Consumer API to subscribe to (read) one or more topics and to process the stream of events produced to them.
-The Kafka Streams API to implement stream processing applications and microservices.
-The Kafka Connect API to build and run reusable data import/export connectors that consume (read) or produce (write) streams of events from and to external systems and applications so they can integrate with Kafka.
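The core concepts above (keyed events, partitioned topics, offsets, durable reads) can be sketched with a toy in-memory model. This is illustration only, not the Kafka client API; a real application would use a client library such as kafka-python or confluent-kafka.

```python
import time

class MiniTopic:
    """Toy in-memory model of one partitioned Kafka-style topic."""
    def __init__(self, partitions=3):
        self._partitions = [[] for _ in range(partitions)]

    def produce(self, key, value):
        # Events with the same key always land in the same partition,
        # which is how per-key ordering is preserved.
        p = hash(key) % len(self._partitions)
        offset = len(self._partitions[p])
        self._partitions[p].append(
            {"key": key, "value": value,
             "timestamp": time.time(), "offset": offset})
        return p, offset

    def consume(self, partition, from_offset=0):
        # Reading does not delete events: they stay durably stored,
        # so consumers can re-read from any offset.
        return self._partitions[partition][from_offset:]
```

The payment example above maps directly onto this: the key “Prakash”, the value “Paid Rs.100 to Vijay”, and the timestamp assigned at produce time.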

 

Cloud Computing


- Pay only for what you need
- Cloud computing is the on-demand delivery of IT resources over the internet with pay-as-you-go pricing
- Scale up or down as needed, whether that means 300 virtual servers or 2,000 terabytes of storage

The three cloud computing deployment models are cloud-based, on-premises, and hybrid.

Cloud-based

Run all parts of the application in the cloud.
Migrate existing applications to the cloud.
Design and build new applications in the cloud.

On-premises

On-premises deployment is also known as a private cloud deployment.
In this model, resources are deployed on premises by using virtualization and resource management tools.
For example, you might have applications that run on technology that is fully kept in your on-premises data center. Though this model is much like legacy IT infrastructure, its incorporation of application management and virtualization technologies helps to increase resource utilization.

Hybrid

Connect cloud-based resources to on-premises infrastructure.
In a hybrid deployment, cloud-based resources are connected to on-premises infrastructure. You might want to use this approach in a number of situations.
For example, you have legacy applications that are better maintained on premises, or government regulations require your business to keep certain records on premises.

Advantages:
-Cost saving
-Stop spending money to run and maintain data centers
-No need to guess capacity
-Increase speed and agility

Cloud computing service categories:

SaaS (Software as a Service)

In this case, third-party providers host applications and make them available to customers over the internet.
Examples: Salesforce, Concur

PaaS (Platform as a Service)

In this case, third-party providers host application development platforms and tools on their own infrastructure and make them available to customers over the internet.

Examples: Google App Engine, AWS Elastic Beanstalk

IaaS (Infrastructure as a Service)

In this case, third-party providers host servers, storage, and other virtual resources and make them available to customers over the internet.

Examples: Amazon EC2, Azure Virtual Machines

EC2

When you’re working with AWS, those servers are virtual, and the service you use to gain access to them is called EC2 (Elastic Compute Cloud). EC2 runs on top of physical host machines managed by AWS using virtualization technology. When you spin up an EC2 instance, you aren’t necessarily taking an entire host to yourself; instead, you share the host with multiple other instances, otherwise known as virtual machines. When you provision an EC2 instance, you can choose an operating system based on either Windows or Linux, and you can provision thousands of EC2 instances on demand.

Advantages

-Cost effective
-highly flexible
-Quick
-You can easily stop or terminate the EC2 instances

Different types of EC2 instances

The different instance families in EC2 are general purpose, compute optimized, memory optimized, accelerated computing, and storage optimized.

General purpose instances provide a good balance of compute, memory, and networking resources, and can be used for a variety of workloads, such as:

application servers
gaming servers
backend servers for enterprise applications
small and medium databases

Compute optimized instances are ideal for compute-intensive tasks like gaming servers, high performance computing or HPC, and even scientific modeling.

Memory optimized instances are good for memory-intensive tasks. Accelerated computing instances are good for floating-point calculations, graphics processing, or data pattern matching, as they use hardware accelerators.

Storage optimized instances are designed for workloads that require high, sequential read and write access to large datasets on local storage. Examples of workloads suitable for storage optimized instances include distributed file systems, data warehousing applications, and high-frequency online transaction processing (OLTP) systems.