We lead community-driven open source projects. We love to develop reusable software libraries, frameworks, and tools; distributed architectures and systems; and multi-threaded, real-time applications.
Our team is mainly experienced in Microsoft technologies (C#, ASP.NET Core, Blazor, Entity Framework Core, TypeScript), JavaScript frameworks (Angular, React, Vue), web development/design tools, database management systems (SQL Server, MySQL, MongoDB), and mobile development (React Native, Xamarin).
Recent blog posts by Volosoft
<div style="font-size:small;text-align:center">Photo by <a href="https://unsplash.com/@willyin?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">yinka adeoti</a> on <a href="https://unsplash.com/s/photos/knobs?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a></div>

[ABP Framework](https://abp.io/) Angular UI has some built-in modules, each of which is a separate npm package. Every package has an exported module that can be loaded by the router. When loaded, these modules introduce several pages to the application via child routes. To allow a degree of customization, we have an extensibility system, and this requires configuration. So, a question arises: How can we configure modules loaded by `loadChildren`? It may look like a requirement specific to our use case, but I believe the method described here may help in other cases, too.

## The Setup

Here is a plain module that attaches a simple component to the path at which the router loads it.

```js
import { Component, Inject, InjectionToken, NgModule } from "@angular/core";
import { RouterModule } from "@angular/router";

export const FOO = new InjectionToken<string>("FOO");

@Component({
  selector: "app-foo",
  template: "{{ foo }} works!"
})
export class FooComponent {
  constructor(@Inject(FOO) public readonly foo: string) {}
}

@NgModule({
  imports: [RouterModule.forChild([{ path: "", component: FooComponent }])],
  declarations: [FooComponent],
  providers: [{ provide: FOO, useValue: "foo" }]
})
export class FooModule {}
```

Let's load this module with the router.
```js
import { Component, NgModule } from "@angular/core";
import { BrowserModule } from "@angular/platform-browser";
import { RouterModule } from "@angular/router";
import { FooModule } from "./foo.module";

@Component({
  selector: "app-root",
  template: "<router-outlet></router-outlet>"
})
export class AppComponent {}

@NgModule({
  imports: [
    BrowserModule,
    RouterModule.forRoot([{ path: "", loadChildren: () => FooModule }])
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule {}
```

Do you want lazy loading instead? Well, then we need to do this:

```js
// remove FooModule from imports

@NgModule({
  imports: [
    BrowserModule,
    RouterModule.forRoot([
      {
        path: "",
        loadChildren: () => import("./foo.module").then(m => m.FooModule)
      }
    ])
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule {}
```

Voilà: "foo works!". Simple, isn't it? So far, there is nothing impressive, but bear with me.

## The Twist

Now, let's make a small change to introduce some configuration options.

```js
// other imports are removed for brevity
import { ModuleWithProviders } from "@angular/core";

@NgModule({
  imports: [RouterModule.forChild([{ path: "", component: FooComponent }])],
  declarations: [FooComponent]
})
export class FooModule {
  static withOptions(foo = "foo"): ModuleWithProviders<FooModule> {
    return {
      ngModule: FooModule,
      providers: [{ provide: FOO, useValue: foo }]
    };
  }
}
```

You have probably used or created a few modules with [the forRoot pattern](https://angular.io/guide/singleton-services#the-forroot-pattern) before. What we did here is similar: We used a static method called `withOptions` to make our module configurable. However, we cannot call this method in `loadChildren`. Eager or lazy, when loading modules with the Angular router, we need to return an `NgModuleFactory`, not a `ModuleWithProviders`. Let's fix this.

## The Factory

The Angular core package exports an abstract class called `NgModuleFactory`.
We will extend it and implement the abstract members to convert a `ModuleWithProviders` into a module factory for the router.

```js
import {
  Compiler,
  Injector,
  ModuleWithProviders,
  NgModuleFactory,
  NgModuleRef,
  StaticProvider,
  Type
} from "@angular/core";

export class ChildModuleFactory<T> extends NgModuleFactory<T> {
  get moduleType(): Type<T> {
    return this.moduleWithProviders.ngModule;
  }

  constructor(private moduleWithProviders: ModuleWithProviders<T>) {
    super();
  }

  create(parentInjector: Injector | null): NgModuleRef<T> {
    const injector = Injector.create({
      parent: parentInjector,
      providers: this.moduleWithProviders.providers as StaticProvider[]
    });
    const compiler = injector.get(Compiler);
    const factory = compiler.compileModuleSync(this.moduleType);
    return factory.create(injector);
  }
}
```

We pass a `ModuleWithProviders` to the `ChildModuleFactory` constructor. The following steps happen when the `RouterConfigLoader` [calls](https://github.com/angular/angular/blob/master/packages/router/src/router_config_loader.ts#L42) the `create` method:

- A new injector is created, using the parent injector and the providers from the `ModuleWithProviders` in the process.
- The `Compiler` is retrieved via that injector.
- The `compileModuleSync` method accesses the `moduleType` property, which returns the `ngModule` from the `ModuleWithProviders`, and creates a factory with it.
- Finally, the module is created using that factory.

## The Solution

Now we will put this factory to use. We are going to add a new static method.
```js
// other imports are removed for brevity
import { NgModuleFactory } from "@angular/core";
import { ChildModuleFactory } from "./child-module-factory";

@NgModule(/* module metadata is removed for brevity */)
export class FooModule {
  static withOptions(foo = "foo"): ModuleWithProviders<FooModule> {
    return {
      ngModule: FooModule,
      providers: [{ provide: FOO, useValue: foo }]
    };
  }

  static asChild(...params: FooOptions): NgModuleFactory<FooModule> {
    return new ChildModuleFactory(FooModule.withOptions(...params));
  }
}

type FooOptions = Parameters<typeof FooModule.withOptions>;
```

The `asChild` static method takes the `ModuleWithProviders` returned by the `withOptions` method, creates a new `ChildModuleFactory` instance with it, and returns that instance. Guess what? We can call this method inside `loadChildren`.

```js
@NgModule({
  imports: [
    BrowserModule,
    RouterModule.forRoot([
      {
        path: "",
        loadChildren: () =>
          import("./foo.module").then(m => m.FooModule.asChild("bar"))
      }
    ])
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule {}
```

What we see on the screen is now "bar works!". Nice! Besides, you can load the module eagerly and still configure it.

```js
// other imports are removed for brevity
import { FooModule } from "./foo.module";

@NgModule({
  imports: [
    BrowserModule,
    RouterModule.forRoot([
      { path: "", loadChildren: () => FooModule.asChild("bar") }
    ])
  ],
  declarations: [AppComponent],
  bootstrap: [AppComponent]
})
export class AppModule {}
```

## Conclusion

Of course, you can create a wrapper module and pass options through it. Or maybe you choose to provide tokens directly, and that's perfectly fine. I have described nothing special, but it's a clean and reusable way to load configurable Angular modules. I have created [a StackBlitz playground](https://stackblitz.com/edit/angular-configure-modules-loaded-by-router?file=src%2Fapp%2Fchild-module-factory.ts) where you can see the `ChildModuleFactory` in action. Thanks for reading.
Have a beautiful day.
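Incidentally, the `FooOptions` alias used in the final `FooModule` relies on TypeScript's built-in `Parameters` utility type. Here is a minimal, framework-free sketch (with hypothetical names standing in for `withOptions`/`asChild`) of how that trick keeps a forwarding method's signature in sync with the method it wraps:

```typescript
// `Parameters<typeof fn>` extracts a function's parameter list as a tuple
// type, so a wrapper can forward arguments without restating the signature.
// All names below are hypothetical illustrations.
function withOptions(foo = "foo", bar = 42) {
  return { foo, bar };
}

type Options = Parameters<typeof withOptions>; // [foo?: string, bar?: number]

// Change the `withOptions` signature and `asChild` follows automatically.
function asChild(...params: Options) {
  return { wrapped: withOptions(...params) };
}

console.log(asChild("baz").wrapped.foo); // "baz"
```

This is why, in the article's code, adding a parameter to `withOptions` never requires touching `asChild`.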
We're pleased to be a community partner of China's **.NET Conf 2020**, which was organized jointly by many .NET communities in China. The conference was held in **Suzhou, China** and online (due to the pandemic) on **19–20 December 2020.**

<br>

The estimated number of online and on-site participants reached **hundreds of thousands**, covering .NET developers, communities, and institutions in 10+ cities such as Beijing, Shanghai, Shenzhen, Guangzhou, Changsha, Chengdu, Xiamen, Jiaodong, and others. Nearly 500 developers were at the conference venue, and 100,000 developers watched live online. It was a big gathering of the Chinese .NET population!

**Theme topics:** Open source, Sharing, Innovation

**Venue:** Suzhou Artificial Intelligence Industrial Park (offline)

<center> .NET Conf China </center> <br>

<center> Sponsors of the conference </center> <br>

Some of the partners of the conference were Microsoft Azure, Microsoft Reactor, GrapeCity, .NET Core Community, and Turing Community. The conference had strong sessions and speakers! A total of 47 sessions were conducted by more than 40 technical experts, each with unique skills, sharing their professional and cutting-edge technical knowledge in the .NET field. Some of the best-known speakers were:

[Scott Hanselman](https://www.linkedin.com/in/scotthanselman/) — Partner Program Manager at Microsoft

[Mingqiang Xu](https://www.linkedin.com/in/mingqiang-xu-52a48110/) — CTO at Microsoft Omnichannel Business Unit

[Jia Woei Ling](https://www.linkedin.com/in/jiawoeiling) — General Manager at Microsoft Azure

Scott Hanselman connected to the conference online to send greetings and new year wishes to the developers!

<center> Scott Hanselman speaking online </center> <br>

We had the opportunity to increase the visibility and reputation of the ABP Platform with our booth for those who wanted to learn more about [**ABP**](https://abp.io/).
Our China team was happy to show attendees the ABP Platform in detail, demonstrate sample applications built with ABP, give demos, and answer any questions. This was a great opportunity for conference attendees to see the power of ABP: how it provides a complete microservice architecture and a strong infrastructure, and how its open-source and commercial modules allow them to develop reusable application modules.

**To give you a glimpse into what the ABP Platform is:**

> *[**ABP**](https://abp.io/) is an open source and community-driven web application framework for ASP.NET Core. It provides an excellent infrastructure to write maintainable, extensible & testable code based on DDD patterns and industry best practices. Furthermore, [**ABP Commercial**](https://commercial.abp.io/) is a complete web development platform based on the ABP Framework. It is the perfect base infrastructure and the right accelerator for SaaS & enterprise-grade ASP.NET Core based web applications, providing pre-built application modules, productivity tools, modern UI themes, premium support & more.*

<center> ABP China Team </center> <br>

<center> Registration desk </center> <br>

<center> Developers showed great interest in ABP's features. </center> <br>

<center> We answered all questions regarding modern web application challenges. </center> <br>

**And we delivered some gifts to lucky developers!**

<br>

The event was streamed live online and can be watched anytime at [https://segmentfault.com/area/dotnetconf-2020](https://segmentfault.com/area/dotnetconf-2020)

The conference official website: https://dotnetconf.cn/

All conference-related tweets can be found under #[dotNETConfChina2020](https://twitter.com/hashtag/dotNETConfChina2020?src=hashtag_click)

It was a great pleasure to be there to meet and communicate with Chinese .NET people! **See you next time, China!**
We recently added multi-tenancy support to the social login system in our [ASP.NET Zero](https://aspnetzero.com/) project. ASP.NET Zero supports Facebook, Google, Microsoft, Twitter, OpenId Connect, and WsFederation login options. Social logins do not support multi-tenancy by default. In this article, I will show you how to configure these social login options per tenant.

#### First of all, what is Multi-Tenancy?

*"Software multitenancy refers to a software architecture in which a single instance of a software runs on a server and serves multiple tenants. A tenant is a group of users who share common access with specific privileges to the software instance. With a multitenant architecture, a software application is designed to provide every tenant a dedicated share of the instance including its data, configuration, user management, tenant individual functionality, and non-functional properties. Multitenancy contrasts with multi-instance architectures, where separate software instances operate on behalf of different tenants"* ([Wikipedia](https://en.wikipedia.org/wiki/Multitenancy))

All social logins store their settings in options classes and use `IOptionsMonitor` to get the current settings. For example, Facebook stores login settings in `FacebookOptions` and receives them through `IOptionsMonitor`:

```csharp
public class FacebookHandler : OAuthHandler<FacebookOptions>
{
    public FacebookHandler(IOptionsMonitor<FacebookOptions> options, ILoggerFactory logger, UrlEncoder encoder, ISystemClock clock)
        : base(options, logger, encoder, clock)
    {
    }
    ///...
```

(See: [here](https://github.com/dotnet/aspnetcore/blob/master/src/Security/Authentication/Facebook/src/FacebookHandler.cs#L22))

That's why we should create a class that we can register in place of `IOptionsMonitor`. Since all of our social logins will need the same pattern, I will create an abstract class that inherits `OptionsMonitor`, then derive from that abstract class for each of the social logins.
<script src="https://gist.github.com/demirmusa/61a9ded9f2adbb3bb8c69f0441f4b702.js"></script>

After that, we can create tenant-based option providers.

### **Facebook**

For Facebook, you should store the AppId and AppSecret per tenant.

<script src="https://gist.github.com/demirmusa/0d6df3f543e821e580c6ba792fc83ab9.js"></script>

Then we can create `TenantBasedFacebookOptions`:

<script src="https://gist.github.com/demirmusa/341fd8e54379753e8b7157549adf2d37.js"></script>

Here is the key point. Before you add the Facebook authentication builder, you have to register `TenantBasedFacebookOptions` as the singleton implementation of `IOptionsMonitor<FacebookOptions>`.

*Startup.cs*

<script src="https://gist.github.com/demirmusa/4f329c7906d2a6b29896446f884eef4b.js"></script>

Then your project will support tenant-based social login settings. Your host and tenants can change their settings at runtime, and your application will work with the current settings.

### Twitter

<script src="https://gist.github.com/demirmusa/b1ad581b711aa66b2720fdcd8a759b61.js"></script>

### Google

<script src="https://gist.github.com/demirmusa/b32fcf9cbadf41ede45aa98092480ed4.js"></script>

### Microsoft

<script src="https://gist.github.com/demirmusa/b244208af3e271486b6b8ce87b939241.js"></script>

### OpenID Connect

<script src="https://gist.github.com/demirmusa/758d14e79b31ac83fe91ea69fffa621c.js"></script>

### WsFederation

<script src="https://gist.github.com/demirmusa/bae30f64efc0ec2c12f654613bc8d04f.js"></script>

Thanks for reading. Please share your thoughts on this article in the comment section below. 😊

* * *

## Read More:

1. [Real-Time Messaging In A Distributed Architecture Using ABP, SignalR & RabbitMQ](https://volosoft.com/blog/RealTime-Messaging-Distributed-Architecture-Abp-SingalR-RabbitMQ)
2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub)
3. [Using Azure Key Vault with ASP.NET Core](https://volosoft.com/blog/Using-Azure-Key-Vault-with-ASP.NET-Core)
## Introduction

This step-by-step article describes how to upload a file to a web server and download it on the client, using ASP.NET Core & the ABP Framework. By following this article, you will create a web project and the related code to upload and download files. Before creating the application, we need to cover some fundamentals.

## BLOB Storing

It is typical to **store file contents** in an application and read these file contents when needed. Not only files; you may also need to save various types of **large binary objects**, a.k.a. [BLOB](https://en.wikipedia.org/wiki/Binary_large_object)s, into **storage**. For example, you may want to save user profile pictures.

A BLOB is typically a **byte array**. There are various places to store a BLOB item: the local file system, a shared database, or [Azure BLOB storage](https://azure.microsoft.com/en-us/services/storage/blobs/) are all options.

The ABP Framework provides an abstraction to work with BLOBs and ships with some pre-built storage providers that you can easily integrate with. Having such an abstraction has some benefits:

- You can **easily integrate** your favorite BLOB storage provider with a few lines of configuration.
- You can then **easily change** your BLOB storage provider without changing your application code.
- If you want to create **reusable application modules**, you don't need to make assumptions about how the BLOBs are stored.

The ABP BLOB Storage system is also compatible with other ABP Framework features like [multi-tenancy](https://docs.abp.io/en/abp/latest/Multi-Tenancy). To get more information about the ABP BLOB Storing system, please check the [documentation](https://docs.abp.io/en/abp/latest/Blob-Storing).

## Preparing the Project

### Startup template and the initial run

The ABP Framework offers startup templates to get into business faster.
We can download a new startup template using the [ABP CLI](https://docs.abp.io/en/abp/latest/CLI):

`abp new FileActionsDemo -m none`

After the download is finished, we run the `FileActionsDemo.DbMigrator` project to create the database and seed the initial data (admin user, role, etc.). Then we run `FileActionsDemo.Web` to see our application working.

> _The default admin username is **admin** and the password is **1q2w3E\***_

### Adding the Blob Storing Module

For this article, we use the [Blob Storing Database Provider](https://docs.abp.io/en/abp/latest/Blob-Storing-Database). You can also use the [Azure](https://docs.abp.io/en/abp/latest/Blob-Storing-Azure) or [File System](https://docs.abp.io/en/abp/latest/Blob-Storing-File-System) providers.

Open a command prompt (terminal) in the folder containing your solution (.sln) file and run the following command:

`abp add-module Volo.Abp.BlobStoring.Database`

This action adds the module dependencies and the module migration. After this action, run `FileActionsDemo.DbMigrator` to update the database.

### Setting up Blob Storage

The BLOB Storage system works with `Containers`. Before using blob storage, we need to create our blob container. Create a class named `MyFileContainer` in the `FileActionsDemo.Domain` project.

```csharp
using Volo.Abp.BlobStoring;

namespace FileActionsDemo
{
    [BlobContainerName("my-file-container")]
    public class MyFileContainer
    {
    }
}
```

That's all; we can start using BLOB storing in our application.

## Creating the Application Layer

Before creating the Application Service, we need to create some [DTO](https://docs.abp.io/en/abp/latest/Data-Transfer-Objects)s used by the Application Service. Create the following DTOs in the `FileActionsDemo.Application.Contracts` project.
- `BlobDto.cs`

```csharp
namespace FileActionsDemo
{
    public class BlobDto
    {
        public byte[] Content { get; set; }

        public string Name { get; set; }
    }
}
```

- `GetBlobRequestDto.cs`

```csharp
using System.ComponentModel.DataAnnotations;

namespace FileActionsDemo
{
    public class GetBlobRequestDto
    {
        [Required]
        public string Name { get; set; }
    }
}
```

- `SaveBlobInputDto.cs`

```csharp
using System.ComponentModel.DataAnnotations;

namespace FileActionsDemo
{
    public class SaveBlobInputDto
    {
        public byte[] Content { get; set; }

        [Required]
        public string Name { get; set; }
    }
}
```

Create the `IFileAppService.cs` interface in the same place as the DTOs.

- `IFileAppService`

```csharp
using System.Threading.Tasks;
using Volo.Abp.Application.Services;

namespace FileActionsDemo
{
    public interface IFileAppService : IApplicationService
    {
        Task SaveBlobAsync(SaveBlobInputDto input);

        Task<BlobDto> GetBlobAsync(GetBlobRequestDto input);
    }
}
```

After creating the DTOs and the interface, the `FileActionsDemo.Application.Contracts` project should look like the following image.

Then we can create our Application Service. Create `FileAppService.cs` in the `FileActionsDemo.Application` project.

```csharp
using System.Threading.Tasks;
using Volo.Abp.Application.Services;
using Volo.Abp.BlobStoring;

namespace FileActionsDemo
{
    public class FileAppService : ApplicationService, IFileAppService
    {
        private readonly IBlobContainer<MyFileContainer> _fileContainer;

        public FileAppService(IBlobContainer<MyFileContainer> fileContainer)
        {
            _fileContainer = fileContainer;
        }

        public async Task SaveBlobAsync(SaveBlobInputDto input)
        {
            await _fileContainer.SaveAsync(input.Name, input.Content, true);
        }

        public async Task<BlobDto> GetBlobAsync(GetBlobRequestDto input)
        {
            var blob = await _fileContainer.GetAllBytesAsync(input.Name);

            return new BlobDto
            {
                Name = input.Name,
                Content = blob
            };
        }
    }
}
```

As you see in the previous code block, we inject `IBlobContainer<MyFileContainer>` into our app service. It will handle all blob actions for us.
- The `SaveBlobAsync` method uses `SaveAsync` of `IBlobContainer<MyFileContainer>` to save the given blob to storage. This is a simple example, so we don't check whether a file with the same name already exists. We send the blob name, the blob content, and `true` for the `overrideExisting` parameter.
- The `GetBlobAsync` method uses `GetAllBytesAsync` of `IBlobContainer<MyFileContainer>` to get the blob content by name.

We have finished the application layer for this project. Next, we will create a `Controller` for the API and a `Razor Page` for the UI.

## Creating the Controller

Create `FileController.cs` in your `FileActionsDemo.HttpApi` project.

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Volo.Abp.AspNetCore.Mvc;

namespace FileActionsDemo
{
    public class FileController : AbpController
    {
        private readonly IFileAppService _fileAppService;

        public FileController(IFileAppService fileAppService)
        {
            _fileAppService = fileAppService;
        }

        [HttpGet]
        [Route("download/{fileName}")]
        public async Task<IActionResult> DownloadAsync(string fileName)
        {
            var fileDto = await _fileAppService.GetBlobAsync(new GetBlobRequestDto { Name = fileName });

            return File(fileDto.Content, "application/octet-stream", fileDto.Name);
        }
    }
}
```

As you see, `FileController` injects the `IFileAppService` that we defined before. This controller has only one endpoint. `DownloadAsync` is used to send a file from the server to the client. The endpoint requires only a `string` parameter, which we use to get the stored blob. If the blob exists, we return a `File` result so the download can start.

## Creating the User Interface

We will create only one page to prove that the download and upload actions are working. Create a folder named `Files` in the `Pages` folder of the `FileActionsDemo.Web` project. Create a Razor page named `Index` with its model.
- `Index.cshtml.cs`

```csharp
using System.ComponentModel.DataAnnotations;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Volo.Abp.AspNetCore.Mvc.UI.RazorPages;

namespace FileActionsDemo.Web.Pages.Files
{
    public class Index : AbpPageModel
    {
        [BindProperty]
        public UploadFileDto UploadFileDto { get; set; }

        private readonly IFileAppService _fileAppService;

        public bool Uploaded { get; set; } = false;

        public Index(IFileAppService fileAppService)
        {
            _fileAppService = fileAppService;
        }

        public void OnGet()
        {
        }

        public async Task<IActionResult> OnPostAsync()
        {
            using (var memoryStream = new MemoryStream())
            {
                await UploadFileDto.File.CopyToAsync(memoryStream);

                await _fileAppService.SaveBlobAsync(
                    new SaveBlobInputDto
                    {
                        Name = UploadFileDto.Name,
                        Content = memoryStream.ToArray()
                    }
                );
            }

            return Page();
        }
    }

    public class UploadFileDto
    {
        [Required]
        [Display(Name = "File")]
        public IFormFile File { get; set; }

        [Required]
        [Display(Name = "Filename")]
        public string Name { get; set; }
    }
}
```

As you see, we use `UploadFileDto` as a `BindProperty`, and we inject `IFileAppService` to upload files. `UploadFileDto` requires a `string` to use as the blob name and an `IFormFile` sent by the user. In the post action (`OnPostAsync`), if everything is fine, we use a `MemoryStream` to get all bytes from the file content. Then we save the file with the `SaveBlobAsync` method of `IFileAppService`.
- `Index.cshtml`

```csharp
@page
@model FileActionsDemo.Web.Pages.Files.Index

@section scripts{
    <abp-script src="/Pages/Files/index.js" />
}

<abp-card>
    <abp-card-header>
        <h3>File Upload and Download</h3>
    </abp-card-header>
    <abp-card-body>
        <abp-row>
            <abp-column>
                <h3>Upload File</h3>
                <hr />
                <form method="post" enctype="multipart/form-data">
                    <abp-input asp-for="UploadFileDto.Name"></abp-input>
                    <abp-input asp-for="UploadFileDto.File"></abp-input>
                    <input type="submit" class="btn btn-info" />
                </form>
            </abp-column>
            <abp-column style="border-left: 1px dotted gray">
                <h3>Download File</h3>
                <hr />
                <form id="DownloadFile">
                    <div class="form-group">
                        <label for="fileName">Filename</label><span> * </span>
                        <input type="text" id="fileName" name="fileName" class="form-control">
                    </div>
                    <input type="submit" class="btn btn-info" />
                </form>
            </abp-column>
        </abp-row>
    </abp-card-body>
</abp-card>
```

We divided the page vertically; the left side will be used for upload and the right side for download. We use [ABP Tag Helpers](https://docs.abp.io/en/abp/latest/UI/AspNetCore/Tag-Helpers/Index) to create the page.

- `index.js`

```javascript
$(function () {
  var DOWNLOAD_ENDPOINT = "/download";

  var downloadForm = $("form#DownloadFile");

  downloadForm.submit(function (event) {
    event.preventDefault();

    var fileName = $("#fileName").val().trim();

    var downloadWindow = window.open(
      DOWNLOAD_ENDPOINT + "/" + fileName,
      "_blank"
    );
    downloadWindow.focus();
  });

  $("#UploadFileDto_File").change(function () {
    var fileName = $(this)[0].files[0].name;
    $("#UploadFileDto_Name").val(fileName);
  });
});
```

This jQuery code handles the download. We also wrote a simple handler to autofill the `Filename` input when the user selects a file. After creating the Razor page and the JS file, the `FileActionsDemo.Web` project should look like the following image.

## Result

After completing the code tutorial, run the `FileActionsDemo.Web` project and go to `/Files`. You can upload any file with any name and also download the uploaded files.
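One caveat worth noting about `index.js` above: the file name is concatenated into the URL as-is, so a name containing a space, `#`, or `?` would produce a broken request path. A small sketch of a safer URL builder (a hypothetical helper, not part of the tutorial code) that could be called before `window.open`:

```typescript
// Percent-encodes the file name so characters like spaces, '#' and '?'
// cannot break the /download/{fileName} route.
function buildDownloadUrl(endpoint: string, fileName: string): string {
  return `${endpoint}/${encodeURIComponent(fileName.trim())}`;
}

console.log(buildDownloadUrl("/download", "my report.pdf"));
// → /download/my%20report.pdf
```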
Thanks for reading. Please share your thoughts on this article in the comment section below.😊 * * * ## Read More: 1. [Real-Time Messaging In A Distributed Architecture Using ABP, SignalR & RabbitMQ](https://volosoft.com/blog/RealTime-Messaging-Distributed-Architecture-Abp-SingalR-RabbitMQ) 2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub) 3. [Using Azure Key Vault with ASP.NET Core](https://volosoft.com/blog/Using-Azure-Key-Vault-with-ASP.NET-Core)
There is no doubt that we all want our applications to load fast. To achieve this, we need to keep the initial bundle size small, and lazy loading is one of the methods we use: we can lazy load some third-party JavaScript files and some CSS files. In this article, we will learn how to lazy load our CSS files and how to emit them with a hash in the production build to prevent stale browser caches.

While developing [ABP Commercial](https://commercial.abp.io/), we were able to load CSS files lazily, but we could not emit these CSS files to the build output with a hash. During my research, I encountered [this issue](https://github.com/angular/angular-cli/issues/12552) and saw that the name of a lazily loaded CSS file cannot be hashed. The names of CSS files may not be hashed, but the names of JavaScript files can be. So I achieved my goal by importing JavaScript files instead of CSS files.

<div style="font-size:small;text-align:center">Switching between <a href="https://abp.io">ABP</a> Lepton themes</div>

Let's see how this can be done with an application that switches between [Bootswatch](https://bootswatch.com) themes. I downloaded the minified CSS files of the [Materia](https://bootswatch.com/materia), [Journal](https://bootswatch.com/journal), and [Lux](https://bootswatch.com/lux) themes and copied them to the `src/assets/styles` folder. Then I created a JavaScript file corresponding to each CSS file.

<p align="center"> <img src="/api/blogging/files/www/fc09ea0c2eb6a3e27fda39f644810ff8.png"> </p> <p style="font-size:small;text-align:center;">assets/styles</p>

The content of `{theme-name}.js`:

```js
import css from "./{theme-name}.min.css";
export default css;
```

In each of the JavaScript files, the related CSS file is imported and re-exported as the default export. Let's see how to load these JavaScript files in our application. We'll create a service and load themes via this service.
A service called `SwitchThemeService` can be generated with the following command:

```bash
ng generate service switch-theme/switch-theme
```

Replace the `SwitchThemeService` content with the following:

```js
import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root',
})
export class SwitchThemeService {
  selectedTheme = 'materia';
  insertedElement: HTMLElement;

  constructor() {
    this.loadTheme();
  }

  loadTheme() {
    import(
      /* webpackChunkName: "[request]" */
      `../../assets/styles/${this.selectedTheme}.js`
    )
      .then((s) => s.default)
      .then(this.insertToDom);
  }

  insertToDom = (content: string) => {
    const element = document.createElement('style');
    element.textContent = content;
    document.head.appendChild(element);

    if (this.insertedElement) this.insertedElement.remove();
    this.insertedElement = element;
  };
}
```

What we did above:

* Defined the `selectedTheme` variable with the initial value `materia`.
* Defined the `insertedElement` variable to keep the inserted DOM element.
* Defined the `loadTheme` method for lazy loading the JavaScript files in the `assets/styles` folder. The [import function of Webpack](https://webpack.js.org/api/module-methods/#import-1) loads JavaScript modules dynamically and returns the result as a promise. The `default` property of the `import` function's result gets us the raw CSS content. We handle the content in the `loadTheme` method and pass it to the `insertToDom` method.
* Called the `loadTheme` method in the constructor to load the initial theme on application initialization.

Webpack includes the JavaScript files passed to the import function in the build output. In the comment block above, we specified the chunk names with a magic comment, `webpackChunkName`. See the [Magic Comments section of the Webpack documentation](https://webpack.js.org/api/module-methods/#magic-comments).

We will create a component to change the theme on the fly.
A component named `SwitchThemeComponent` can be generated with the following command:

```bash
ng generate component switch-theme --inlineTemplate
```

Replace the `SwitchThemeComponent` content with the following:

```js
import { Component } from '@angular/core';
import { SwitchThemeService } from './switch-theme.service';

@Component({
  selector: 'app-switch-theme',
  template: `
    <select [(ngModel)]="service.selectedTheme" (ngModelChange)="service.loadTheme()">
      <option [ngValue]="'materia'">Materia</option>
      <option [ngValue]="'journal'">Journal</option>
      <option [ngValue]="'lux'">Lux</option>
    </select>
  `,
})
export class SwitchThemeComponent {
  constructor(public service: SwitchThemeService) {}
}
```

Switching between themes is done with the `select` element, using `SwitchThemeService` in the component. Take a look at how it works:

See the generated chunks and hashed chunk names in the production build:

See the [live demo](https://mehmet-erim.github.io/lazy-load-hashed-css-files) and the [source code on GitHub](https://github.com/mehmet-erim/lazy-load-hashed-css-files). Follow me on [Twitter](https://twitter.com/mehmeterim_) and [GitHub](https://github.com/mehmet-erim).

Thanks for reading. Please share your thoughts on this article in the comment section below. 😊

* * *

## Read More:

1. [How to Use Attribute Directives to Avoid Repetition in Angular Templates](https://volosoft.com/blog/attribute-directives-to-avoid-repetition-in-angular-templates)
2. [Strategy Pattern Implementation with Typescript and Angular](https://volosoft.com/blog/strategy-pattern-implementation-with-typescript-and-angular)
3. [What is New in Angular 10?](https://volosoft.com/blog/what-is-new-in-angular-10)
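Incidentally, the one-line wrapper modules described above do not have to be written by hand. Their content is fully determined by the CSS file name, so a small helper (hypothetical, shown here as a pure function) can compute each wrapper:

```typescript
// Computes the wrapper module that should sit next to a given
// `{theme-name}.min.css` file, re-exporting the raw CSS content as the
// module's default export, exactly like the hand-written files above.
function cssWrapperFor(
  cssFileName: string
): { name: string; content: string } | null {
  if (!cssFileName.endsWith(".min.css")) return null; // not a theme file
  const themeName = cssFileName.slice(0, -".min.css".length);
  return {
    name: `${themeName}.js`,
    content: `import css from "./${themeName}.min.css";\nexport default css;\n`,
  };
}
```

A build script could run this over the file names in `src/assets/styles` and write the results, keeping the wrappers in sync whenever a theme is added or removed.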
The new version, v10, of Angular was published only hours ago and [announced by this blog post](https://blog.angular.io/version-10-of-angular-now-available-78960babd41). Although it may not appear as impactful as v9 (with Ivy and all), this release displays the Angular team's commitment to keeping Angular up-to-date and relevant. We found this very exciting and the timing was just right, because we are about to release [ABP](https://abp.io) v3.0! So, we jumped into the details of what changed and how to migrate. Here is what we found.

### Major Changes

#### TypeScript v3.9 Support [_breaking change_]

Angular 9 was released with TypeScript 3.7 support. Soon TypeScript 3.8 was released and Angular v9.1 supported it. Not long after, another TypeScript version, 3.9, was released, and Angular responds with v10, keeping up not only with TypeScript, but also with tslib and TSLint. That is great news. Angular stays up-to-date.

First of all, TypeScript 3.9 has performance improvements, which means faster Angular builds, especially in larger projects. Second, all the latest TypeScript fixes and features are available to Angular developers. Last but not least, Angular developers will be using a more elaborate TypeScript configuration.

Earlier versions of TypeScript are no longer supported, so you have to install v3.9 in your project. I believe a major reason behind this is the next feature I will describe.

#### “Solution Style” `tsconfig.json` Files

“Solution Style” `tsconfig.json` file support was [introduced by TypeScript v3.9](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-9.html#support-for-solution-style-tsconfigjson-files) to overcome issues with cases where a tsconfig.json existed just to reference other tsconfig.json files, known as a "solution". Angular 10 makes use of that concept and **improves IDE support** and, consequently, developer experience.
A new file called `tsconfig.base.json` is introduced, and what was inside the root `tsconfig.json` before is carried to this new file. You can find further details about the "solution" configuration [here](https://angular.io/guide/typescript-configuration#configuration-files), but basically the new `tsconfig.json` at root level, before and after adding a library to the project, looks like this:

##### BEFORE:

```json
{
  "files": [],
  "references": [
    { "path": "./tsconfig.app.json" },
    { "path": "./tsconfig.spec.json" },
    { "path": "./e2e/tsconfig.json" }
  ]
}
```

##### AFTER:

```json
{
  "files": [],
  "references": [
    { "path": "./tsconfig.app.json" },
    { "path": "./tsconfig.spec.json" },
    { "path": "./e2e/tsconfig.json" },
    { "path": "./projects/core/tsconfig.lib.json" },
    { "path": "./projects/core/tsconfig.spec.json" }
  ]
}
```

If you upgrade to Angular 10 using `ng update`, the CLI will migrate your workspace to this structure. Earlier versions of TypeScript do not support "solution style", so this may be the reason behind the breaking change described above.

#### Angular Package Format Changes & esm5/fesm5

The [Angular package format](https://docs.google.com/document/d/1CZC2rcpxffTDfRDs6p1cfbmKNLA6x5O-NtkJglDaBVs/preview) has changed, and the new format does not include `esm5` and `fesm5` distributions anymore; Angular packages (@angular/*) will no longer ship them. Since Angular generates ES5 from ES2015 and ES2015 is the default language level consumed by Angular tooling, those code distributions had become obsolete. The format change is as follows:

##### BEFORE:

```json
{
  ...
  "main": "bundles/abp-ng.core.umd.js",
  "module": "fesm5/abp-ng.core.js",
  "es2015": "fesm2015/abp-ng.core.js",
  "esm5": "esm5/abp-ng.core.js",
  "esm2015": "esm2015/abp-ng.core.js",
  "fesm5": "fesm5/abp-ng.core.js",
  "fesm2015": "fesm2015/abp-ng.core.js",
  ...
}
```

##### AFTER:

```json
{
  ...
"main": "bundles/abp-ng.core.umd.js", "module": "fesm2015/abp-ng.core.js", "es2015": "fesm2015/abp-ng.core.js", "esm2015": "esm2015/abp-ng.core.js", "fesm2015": "fesm2015/abp-ng.core.js", ... } ``` If your application depends on esm5/fesm5 files, you can relax, because they are still consumable by the build system. Likewise, you do not have to worry if your Angular library does not ship `esm2015` or `fesm2015`, because the CLI will fallback to others. However, in favor of bundle optimization and build speed, it is recommended to deliver ES2015 outputs. #### Browserlist Angular generates bundles based on the [Browserlist](https://github.com/browserslist/browserslist) configuration provided in the root app folder. Angular 10 will look up for a `.browserlistrc` in your app, but fall back to `browserlist` if not found. The `ng update ` command will rename the file for you. ### Breaking Changes * Resolvers that return `EMPTY` will cancel navigation. In order to allow the router to continue navigating to the route, emit some value such as `of(null)`. * Logging of unknown property bindings or element names in templates is increased to "error" level. It was "warning" before. The change may have an effect on tools not expecting an error log. * Returning `null` from a [UrlMatcher](https://angular.io/api/router/UrlMatcher) would throw `Type 'null' is not assignable to type 'UrlMatchResult'.` before. This is fixed, but the return type can now be `null` too. * Reactive forms had a bug which caused `valueChanges` for controls bound to input fields with `number` type to fire twice. The listened evet is changed from "change" to "input" to fix this. * `minLength` and `maxLength` validators validate only if the value has a numeric `length` property. * There was a bug in detection of day span while using `formatDate()` or `DatePipe` and `b` or `B` format code. It is fixed and the output, for instance, will now be "at night" instead of "AM". 
* Transplanted views (declared in one component and inserted into another) [had change detection issues](https://github.com/angular/angular/pull/35968), but that is fixed now. Detection after detaching and double detection are avoided.

### Deprecations and Removals

#### ModuleWithProviders Without a Generic Type [_removed_]

Earlier versions of Angular were able to compile static method returns with the `ModuleWithProviders` type without the generic type, i.e. `ModuleWithProviders<SomeModule>`, because the generated `metadata.json` files would have the information required for compilation. After Ivy, since `metadata.json` is not required, Angular checks the generic type for type validation.

##### BEFORE:

```js
@NgModule({...})
export class SomeModule {
  static forRoot(): ModuleWithProviders {
    return {
      ngModule: SomeModule,
      providers: [...]
    };
  }
}
```

##### AFTER:

```js
@NgModule({...})
export class SomeModule {
  static forRoot(): ModuleWithProviders<SomeModule> {
    return {
      ngModule: SomeModule,
      providers: [...]
    };
  }
}
```

`ModuleWithProviders` without a generic type was deprecated before. As of Angular 10, it is completely removed.

#### Undecorated Base Classes [_removed_]

If you are taking advantage of inheritance from classes that use Angular features such as dependency injection or Angular decorators, you now need to decorate those base classes as well. Otherwise, Angular will throw an error about the missing decorator on the parent.

##### DEPENDENCY INJECTION:

```js
@Directive()
export abstract class AbstractSome {
  constructor(@Inject(LOCALE_ID) public locale: string) {}
}

@Component({
  selector: 'app-some',
  template: 'Locale: {{ locale }}',
})
export class SomeComponent extends AbstractSome {}
```

Here is the error the Angular 10 compiler throws when the `Directive` decorator is missing:

```txt
The component SomeComponent inherits its constructor from AbstractSome, but the latter does not have an Angular decorator of its own.
Dependency injection will not be able to resolve the parameters of AbstractSome's constructor. Either add a @Directive decorator to AbstractSome, or add an explicit constructor to SomeComponent.
```

##### DECORATOR:

```js
@Directive()
export abstract class AbstractSome {
  @Input() x: number;
}

@Component({
  selector: 'app-some',
  template: 'Value of X: {{ x }}',
})
export class SomeComponent extends AbstractSome {}
```

The Angular 10 compiler throws a less detailed error this time:

```txt
Class is using Angular features but is not decorated. Please add an explicit Angular decorator.
```

I am sure you would not do that, but if you put a `Component` decorator on the parent and remove the decorator on the child, as you would expect, the Angular 10 compiler will throw the error below:

```txt
The class 'SomeComponent' is listed in the declarations of the NgModule 'AppModule', but is not a directive, a component, or a pipe. Either remove it from the NgModule's declarations, or add an appropriate Angular decorator.
```

#### WrappedValue [_deprecated_]

`WrappedValue` is deprecated and will probably be removed with v12. Check [here](https://github.com/angular/angular/blob/d1ea1f4c7f3358b730b0d94e65b00bc28cae279c/packages/core/src/util/WrappedValue.ts) and [here](https://github.com/angular/angular/blob/9.1.x/packages/common/src/pipes/async_pipe.ts#L121) if you have never heard of it before. It was useful to trigger change detection even when the same object instance was produced/emitted. There is a performance cost when `WrappedValue` is used, and the cases where it helps are relatively rare, so the Angular team decided to drop it.

As a side effect of this deprecation, you may see more `ExpressionChangedAfterItHasBeenChecked` errors than before, because Angular would not throw an error when `WrappedValue`s were evaluated as equal.
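The reason a wrapper (or a clone) helps can be seen with plain reference equality, which is what a default dirty check relies on once `WrappedValue` is gone. A minimal TypeScript sketch, not Angular code:

```typescript
// A reference-equality dirty check, standing in for Angular's default check.
const changed = (prev: object, next: object) => prev !== next;

const state = { count: 1 };
const before = state;
state.count = 2; // mutated in place: still the same reference

// Same reference, so the check sees no change despite the mutation.
const missedByRefCheck = !changed(before, state);

// A shallow clone produces a new reference, so the check fires.
const caughtWithClone = changed(before, { ...state });
```

This is why producing a new object reference (or wrapping the value) makes an in-place mutation visible to the check.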
In case you face change detection issues, try cloning the object or triggering change detection manually via the `markForCheck` or `detectChanges` methods of `ChangeDetectorRef`.

#### Other Deprecations & Removals

* Support for IE9, IE10, and IE Mobile has been deprecated and will be dropped later. The increased bundle size and complexity were the main reasons. Considering even Microsoft dropped support for these browsers, it makes a lot of sense.
* Angular stopped sanitizing the style property bindings. This is due to the dropped support for legacy browsers (like IE6 and IE7) and the performance cost of having a sanitizer.
* Bazel build schematics will not be continued. The Angular team explained [the reasons](https://github.com/angular/angular/tree/10.0.x/packages/bazel/src/schematics) and referred to [this monorepo](https://github.com/bazelbuild/rules_nodejs/tree/master/examples/angular) as a source to keep an eye on, if you are interested in building Angular with Bazel.

### Conclusion

I would like to emphasize how thankful I am that the Angular team is trying to keep Angular up-to-date. This is a great demonstration and, in my humble opinion, just as meaningful as a state-of-the-art renderer. It is also very nice to see how easy it is to migrate an existing project. Angular not only keeps up with its ecosystem, but also helps you to keep up together with it. Congratulations, and thank you!

* * *

## Read More:

1. [How to Use Attribute Directives to Avoid Repetition in Angular Templates](https://volosoft.com/blog/attribute-directives-to-avoid-repetition-in-angular-templates)
2. [Strategy Pattern Implementation with Typescript and Angular](https://volosoft.com/blog/strategy-pattern-implementation-with-typescript-and-angular)
3. [ASP.NET Core Angular Refresh Token Implementation](https://volosoft.com/blog/ASP.NET-Core-Angular-Refresh-Token-Implementation)
We are very pleased to announce that [GetApp](https://getapp.com) has named ASP.NET Zero as one of the best in the Application Development Tools category. The ranking showcases the leading 5 apps based on factors such as: - have more than 10 user reviews - user ratings higher than 4.5 (out of 5) - offer key app development features (at least four out of these seven app development features: compatibility testing, debugging, analytics, integrated development environment, mobile app development, web app development, and software development.) <br> ASP.NET Zero has been placed **#1** on this list, with an overall rating of **4.9/5**. We really appreciate our devoted customers who took the time to share their opinion about ASP.NET Zero, which has helped us receive an award by GetApp — [**5 Best Application Development Tools** ](https://www.getapp.com/resources/best-application-development-tools/)(out of 166 apps) in 2020. > *[ASP.NET Zero](http://aspnetzero.com?utmsource=volosoft&utmmedium=getappannouncement) is dedicated to providing a production-ready enterprise-level application base for ASP.NET Core based applications with tons of useful features that should be in a line of business application and saves more than several months in development time with pre-built pages (authentication, permission management, localization, multi-tenancy, SaaS features, CRUD Page Generator (RAD Tool), themes, etc.) The significant advantage of having ASP.NET Zero in the technology stack is reducing project development costs and speeding up the project development.* > ASP.NET Zero has more than [10 reviews](https://www.getapp.com/development-tools-software/a/asp-net-zero/reviews/) on the GetApp website. Here’s what a few of our “more than satisfied” customers had to say: *“We create fast to market web applications for our partners and clients using ASP.NET Zero. It’s a perfect solution for this. Even the first non-functional prototypes look great from the start. 
The standard features are a welcome starting point. You can start adding your own functionality right away without worrying about authentication, user/role management, logging, etc.” **Jeroen Guldemond, Technical Consultant & Co-Owner*** *“It allows us to accelerate our system development process, without forgetting the quality and good programming practices.” **Douglas Bustos, CEO*** *“ASP.NET Zero helped us in quickly starting the project. Glad that we chose ASP.NET Zero. It is like hiring a dedicated team of 10 developers working for us for almost free! It saved at least 6 months of initial effort. Repository pattern is handy.” **Ajay Kumar, Senior Developer*** We are never satisfied, even at the top. The appreciation of our happy customers pushes us to continue to improve ASP.NET Zero, add more features, and make it as user-friendly as possible. We cannot thank our customers enough for their trust. We promise that everything will only get better over time. If you haven’t done it yet, it’s not too late to **[write a review on GetApp](https://reviews.getapp.com/new/133667)** and share the love for your favorite application development software! * * * #### **What makes ASP.NET Zero one of the best application development software?** <br>  <br> <br> - **SOLID Architecture**: The solution provides clean source code in a well-defined solution structure, layered, modular, and Domain-Driven Design implemented. <br> <br> - **Well-documented**: It has comprehensive documentation, it will help a lot while developing your application. So your developers will know how to write code that follows best practices. <br> <br> - **Mature framework**: The solution is **mature** and has solved most of the issues faced when creating an application from scratch. 
<br> <br> - **Cross-cutting concerns implemented:** It provides many **pre-built functionalities** that are common to almost every application: login, user, role and permission management, audit logs, settings, user profile, multi-language, multi-tenancy, and so on. So you will no longer spend time developing basic functionality and you can immediately proceed to develop business logic and unique solutions. <br> <br> - **Full source code included**: It provides the **full source code** hence you can fully customize it as per your business requirements. <br> <br> - **Open-source based & built with best practices**: It’s based on the open-source and community-driven [ABP](http://aspnetboilerplate.com?utm_source=volosoft&utm_medium=getappannouncement) framework. It is actively used by thousands of developers and is continuously developed. It makes your daily development easier by providing base classes and infrastructure and automates your repeated tasks. <br> <br> - **Modern UI**: Its UI is based on [Metronic](https://keenthemes.com/metronic/) Theme (the world’s most trusted UI theme), which allows you to get a clean and precise front-end. 
<br>
<br>

Here are some other ASP.NET Zero features that our customers have found highly effective in increasing productivity and delivering projects faster:

<br>

- Multiple Architecture Options — the most advanced technologies from Microsoft <br> \- ASP.NET Core & Angular (also includes Xamarin app) <br> \- ASP.NET Core & jQuery (also includes Xamarin app) <br> \- ASP.NET MVC 5.x & jQuery <br> \- ASP.NET MVC 5.x & AngularJS <br> <br>
- Built-in Multi-Tenancy (SaaS)
- Advanced Authentication & Authorization
- User, Role and Permission Management
- Rapid Application Development (RAD tooling)
- Dynamic UI Localization
- Setting Management
- Automated Testing
- Language Management
- Customizable Dashboard System
- Dynamic Entity Properties
- and [more](https://aspnetzero.com/Features?utm_source=volosoft&utm_medium=getappannouncement)…

[Create a demo now](https://aspnetzero.com/demo?utm_source=volosoft&utm_medium=getappannouncement) to see what the UI looks like and try all ASP.NET Zero features for free.

***

Feel free to [contact us](mailto:info@aspnetzero.com) if you need any further information or have any questions or comments.
In this article, we will build a basic real-time messaging application in a distributed architecture. We will use the [ABP Framework](https://abp.io) for infrastructure and its tiered startup template, [SignalR](https://dotnet.microsoft.com/apps/aspnet/signalr) for real-time server-client communication, and [RabbitMQ](https://www.rabbitmq.com/) as the distributed event bus.

When the Web & API tiers are separated, it is impossible to directly send a server-to-client message from the HTTP API. This is also true for a microservice-architected application. We suggest using the distributed event bus to deliver the message from the API application to the web application, then to the client.

Above, you can see the data flow that we will implement in this article. This diagram represents how data will flow in our application when **Client 1** sends a message to **Client 2**. It is explained in 5 steps:

1. **Client 1** sends message data to the **Web Application** via a REST call.
2. The **Web Application** redirects the message data to the **Http Api**.
3. The message data is processed in the **Http Api**, and the **Http Api** publishes an event that holds the data that will be sent to **Client 2**.
4. The **Web Application**, which is subscribed to that event, receives it.
5. The **Web Application** sends the message to **Client 2**.

For this example flow, we could send a message from **Client 1** to **Client 2** directly on the **SignalR Hub**. However, what we are trying to demonstrate here is sending a real-time message from the **Http Api** to a specific user who is connected to the web application.

## Implementation

### Startup template and the initial run

The [ABP Framework](https://abp.io) offers startup templates to get into business faster. We can download a new tiered startup template using the [ABP CLI](https://docs.abp.io/en/abp/latest/CLI):

`abp new SignalRTieredDemo --tiered`

After the download is finished, we run the ***.DbMigrator** project to create the database and seed the initial data (admin user, role, etc).
Then we run ***.IdentityServer**, ***.HttpApi.Host** and ***.Web** to see our application working. ### Creating Application Layer We create an [application service](https://docs.abp.io/en/abp/latest/Application-Services) that publishes the message as an event. In ***.Application.Contracts** project: ````csharp using System.Threading.Tasks; using Volo.Abp.Application.Services; namespace SignalRTieredDemo { public interface IChatAppService : IApplicationService { Task SendMessageAsync(SendMessageInput input); } } ```` Input DTO for SendMessageAsync method: ````csharp namespace SignalRTieredDemo { public class SendMessageInput { public string TargetUserName { get; set; } public string Message { get; set; } } } ```` Event transfer object (ETO) for communication on event bus: ````csharp using System; namespace SignalRTieredDemo { public class ReceivedMessageEto { public string ReceivedText { get; set; } public Guid TargetUserId { get; set; } public string SenderUserName { get; set; } public ReceivedMessageEto( Guid targetUserId, string senderUserName, string receivedText) { ReceivedText = receivedText; TargetUserId = targetUserId; SenderUserName = senderUserName; } } } ```` In ***.Application** project: ````csharp using System.Threading.Tasks; using Microsoft.AspNetCore.Identity; using Volo.Abp.EventBus.Distributed; using Volo.Abp.Identity; namespace SignalRTieredDemo { public class ChatAppService: SignalRTieredDemoAppService, IChatAppService { private readonly IIdentityUserRepository _identityUserRepository; private readonly ILookupNormalizer _lookupNormalizer; private readonly IDistributedEventBus _distributedEventBus; public ChatAppService(IIdentityUserRepository identityUserRepository, ILookupNormalizer lookupNormalizer, IDistributedEventBus distributedEventBus) { _identityUserRepository = identityUserRepository; _lookupNormalizer = lookupNormalizer; _distributedEventBus = distributedEventBus; } public async Task SendMessageAsync(SendMessageInput input) { var 
targetId = (await _identityUserRepository.FindByNormalizedUserNameAsync(_lookupNormalizer.NormalizeName(input.TargetUserName))).Id;

            await _distributedEventBus.PublishAsync(new ReceivedMessageEto(targetId, CurrentUser.UserName, input.Message));
        }
    }
}
````

### Creating API Layer

We create an endpoint for sending the message that redirects the process to the application layer.

In the **Controllers** folder of the ***.HttpApi** project:

````csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Volo.Abp.AspNetCore.Mvc;

namespace SignalRTieredDemo.Controllers
{
    [Route("api/app/chat")]
    public class ChatController : AbpController, IChatAppService
    {
        private readonly IChatAppService _chatAppService;

        public ChatController(IChatAppService chatAppService)
        {
            _chatAppService = chatAppService;
        }

        [HttpPost]
        [Route("send-message")]
        public async Task SendMessageAsync(SendMessageInput input)
        {
            await _chatAppService.SendMessageAsync(input);
        }
    }
}
````

### Adding SignalR

To add SignalR to our solution, we add the `Volo.Abp.AspNetCore.SignalR` nuget package to the ***.Web** project, and then add the `AbpAspNetCoreSignalRModule` dependency:

````csharp
namespace SignalRTieredDemo.Web
{
    [DependsOn(
        ...
        typeof(AbpAspNetCoreSignalRModule) // <---
    )]
    public class SignalRTieredDemoWebModule : AbpModule
    {
````

Also, we need to add the [@abp/signalr](https://www.npmjs.com/package/@abp/signalr) npm package to package.json in the ***.Web** project, then run the **yarn** and **gulp** commands.

````json
{
  .
  .
  "dependencies": {
    .
    .
    "@abp/signalr": "^2.9.0"
  }
}
````

*Remember to add the latest package version.*

You can find more information on the ABP SignalR integration in [the related document](https://docs.abp.io/en/abp/latest/SignalR-Integration).

### Creating A Hub

We need a hub for the SignalR connection. We can inherit it from the `AbpHub` base class.
In the ***.Web** project:

````csharp
using Microsoft.AspNetCore.Authorization;
using Volo.Abp.AspNetCore.SignalR;

namespace SignalRTieredDemo.Web
{
    [Authorize]
    public class ChatHub : AbpHub
    {
    }
}
````

While you could inherit from the standard `Hub` class, `AbpHub` has some common services pre-injected as base properties, which is useful during development.

### Adding & Configuring RabbitMQ

To add RabbitMQ to our solution, we add the `Volo.Abp.EventBus.RabbitMQ` nuget package to the ***.HttpApi.Host** and ***.Web** projects.

Launch a **command line**, navigate to the directory where the ***.HttpApi.Host.csproj** file exists, and run the command below using the [ABP CLI](https://docs.abp.io/en/abp/latest/CLI):

````bash
abp add-package Volo.Abp.EventBus.RabbitMQ
````

Then do the same for the ***.Web** project.

After we add the package, we configure RabbitMQ by adding the configuration to the **appsettings.json** files of those projects.

For the ***.HttpApi.Host** project:

````json
{
  ...
  "RabbitMQ": {
    "Connections": {
      "Default": {
        "HostName": "localhost"
      }
    },
    "EventBus": {
      "ClientName": "SignalRTieredDemo_HttpApi",
      "ExchangeName": "SignalRTieredDemoTest"
    }
  },
  ...
}
````

For the ***.Web** project:

````json
{
  ...
  "RabbitMQ": {
    "Connections": {
      "Default": {
        "HostName": "localhost"
      }
    },
    "EventBus": {
      "ClientName": "SignalRTieredDemo_Web",
      "ExchangeName": "SignalRTieredDemoTest"
    }
  },
  ...
}
````

### Handling New Message Event

Once we publish a new message event from the `Http Api`, we must handle it in the `Web Application`.
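Conceptually, this publish-and-handle step can be modeled as a tiny publish/subscribe pipeline. The following is a framework-free TypeScript sketch under hypothetical names (this is not ABP, RabbitMQ, or SignalR code): the API side publishes an event, and a web-side subscriber forwards it to the target user's connection only.

```typescript
// Shape of the event, mirroring ReceivedMessageEto from the article.
interface ReceivedMessage {
  targetUserId: string;
  senderUserName: string;
  text: string;
}

// A minimal in-process bus standing in for the distributed event bus.
class DistributedBus<T> {
  private handlers: Array<(evt: T) => void> = [];
  subscribe(handler: (evt: T) => void): void {
    this.handlers.push(handler);
  }
  publish(evt: T): void {
    this.handlers.forEach((h) => h(evt));
  }
}

const bus = new DistributedBus<ReceivedMessage>();
const connections: Record<string, string[]> = { alice: [], bob: [] };

// "Web" side: the subscriber delivers only to the target user's connection.
bus.subscribe(({ targetUserId, senderUserName, text }) => {
  connections[targetUserId].push(`${senderUserName}: ${text}`);
});

// "HTTP API" side: publish on behalf of the sender's REST call.
bus.publish({ targetUserId: 'bob', senderUserName: 'alice', text: 'hi' });
```

In the real application, RabbitMQ carries the event between processes and SignalR carries the final hop to the browser.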
Therefore we need an event handler in ***.Web** Project: ````csharp using System.Threading.Tasks; using Microsoft.AspNetCore.SignalR; using Volo.Abp.DependencyInjection; using Volo.Abp.EventBus.Distributed; namespace SignalRTieredDemo.Web { public class ReceivedMessageEventHandler : IDistributedEventHandler<ReceivedMessageEto>, ITransientDependency { private readonly IHubContext<ChatHub> _hubContext; public ReceivedMessageEventHandler(IHubContext<ChatHub> hubContext) { _hubContext = hubContext; } public async Task HandleEventAsync(ReceivedMessageEto eto) { var message = $"{eto.SenderUserName}: {eto.ReceivedText}"; await _hubContext.Clients .User(eto.TargetUserId.ToString()) .SendAsync("ReceiveMessage", message); } } } ```` ### Creating Chat Page We create the files below in **Pages** folder of ***.Web** Project. **Chat.cshtml**: ````html @page @using Volo.Abp.AspNetCore.Mvc.UI.Packages.SignalR @model SignalRTieredDemo.Web.Pages.ChatModel @section styles { <abp-style src="/Pages/Chat.css" /> } @section scripts { <abp-script type="typeof(SignalRBrowserScriptContributor)" /> <abp-script src="/Pages/Chat.js" /> } <h1>Chat</h1> <div> <abp-row> <abp-column size-md="_6"> <div>All Messages:</div> <ul id="MessageList" style=""> </ul> </abp-column> <abp-column size-md="_6"> <form> <abp-row> <abp-column> <label for="TargetUser">Target user:</label> <input type="text" id="TargetUser" /> </abp-column> </abp-row> <abp-row class="mt-2"> <abp-column> <label for="Message">Message:</label> <textarea id="Message" rows="4"></textarea> </abp-column> </abp-row> <abp-row class="mt-2"> <abp-column> <abp-button type="submit" id="SendMessageButton" button-type="Primary" size="Block" text="SEND!" 
                        />
                    </abp-column>
                </abp-row>
            </form>
        </abp-column>
    </abp-row>
</div>
````

**Chat.cshtml.cs**:

````csharp
using Microsoft.AspNetCore.Mvc.RazorPages;

namespace SignalRTieredDemo.Web.Pages
{
    public class ChatModel : PageModel
    {
        public void OnGet()
        {
        }
    }
}
````

**Chat.css**:

````css
#MessageList {
  border: 1px solid gray;
  height: 400px;
  overflow: auto;
  list-style: none;
  padding-left: 0;
  padding: 10px;
}

#TargetUser {
  width: 100%;
}

#Message {
  width: 100%;
}
````

**Chat.js**:

````javascript
$(function () {
  var connection = new signalR.HubConnectionBuilder().withUrl("/signalr-hubs/chat").build();

  connection.on("ReceiveMessage", function (message) {
    console.log(message);
    $('#MessageList').append('<li><strong><i class="fas fa-long-arrow-alt-right"></i> ' + message + '</strong></li>');
  });

  connection.start().then(function () { }).catch(function (err) {
    return console.error(err.toString());
  });

  $('#SendMessageButton').click(function (e) {
    e.preventDefault();

    var targetUserName = $('#TargetUser').val();
    var message = $('#Message').val();
    $('#Message').val('');

    signalRTieredDemo.controllers.chat.sendMessage({
      targetUserName: targetUserName,
      message: message
    }).then(function () {
      $('#MessageList')
        .append('<li><i class="fas fa-long-arrow-alt-left"></i> ' + abp.currentUser.userName + ': ' + message + '</li>');
    });
  });
});
````

Then we can add this new page to the menu in ***MenuContributor.cs** in the **Menus** folder:

````csharp
...
public class SignalRTieredDemoMenuContributor : IMenuContributor
{
    ...
    private Task ConfigureMainMenuAsync(MenuConfigurationContext context)
    {
        ...
        context.Menu.Items.Add(new ApplicationMenuItem("SignalRDemo.Chat", "Chat", "/Chat")); // <-- We add this line
        return Task.CompletedTask;
    }
    ...
}
````

## Running & Testing

We run the ***.IdentityServer**, ***.HttpApi.Host** and ***.Web** projects in order. After the ***.Web** project is run, first log in with the `admin` username and the `1q2w3E*` password.

After we log in, go to the `/Identity/Users` page and create a new user.
This way, we can chat with them.

Then we open the application in another browser and log in with the user we created above. Now we can go to the chat page and start messaging:

We can test with more users. All sent and incoming messages are displayed in the left box.

### Source code

The source code of the final application can be found on the [GitHub repository](https://github.com/abpframework/abp-samples/tree/master/SignalRTieredDemo).

* * *

## Read More:

1. [How to Use Attribute Directives to Avoid Repetition in Angular Templates](https://volosoft.com/blog/attribute-directives-to-avoid-repetition-in-angular-templates)
2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub Pattern](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub)
3. [Why You Should Prefer Singleton Pattern over a Static Class?](https://volosoft.com/blog/Prefer-Singleton-Pattern-over-Static-Class)
<div style="font-size:small;margin:-1em 0 2em;text-align:center">Photo by <a href="https://unsplash.com/@lucian_alexe?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Lucian Alexe</a> on <a href="/s/photos/usb-c?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a></div> Lately, we have started working on v3.0 of [ABP Framework](https://abp.io/) and our ultimate goal as the frontend team is to introduce solutions that improve developer experience. In this context, one of the things we are working on is migrating the current grid in Angular to the feature-rich [ngx-datatable](https://swimlane.github.io/ngx-datatable/). Nevertheless, when we tried to implement it in the project, we have realized that the amount of code duplicated in each CRUD page was troublesome. ```html <ngx-datatable [rows]="data$ | async" [count]="totalCount$ | async" [loadingIndicator]="list.isLoading$ | async" [limit]="list.maxResultCount" [offset]="list.page" (page)="list.page = $event.offset" (sort)="sort($event)" [externalPaging]="true" [externalSorting]="true" [headerHeight]="50" [footerHeight]="50" rowHeight="auto" columnMode="force" class="material" > <!-- templates here --> </ngx-datatable> ``` > We have a ListService which makes it easier to work with remote pagination and sorting. It is a core feature and we are planning to keep it as UI independent as possible. Some properties of ngx-datatable fit really well, while others, sorting specifically, do not. Nothing is wrong with ngx-datatable. It actually is an amazing work and probably one of the best grids you can use in Angular. And, to be fair, all of the bindings above are for rendering the content properly, so they are not useless after all. Still, from a developer experience perspective, this is painful. Here is how we see it: - CRUD pages in the community version of ABP would require this code to be copied manually over and over. We should avoid this somehow. 
- Although there is [a nice code generator](https://commercial.abp.io/tools/suite) for ABP Commercial users, readability and maintenance of the generated code is an important aspect. Less is more.

Naturally, we started looking for a way to reduce the amount of code that will be necessary each time ngx-datatable is consumed.

## Attribute Directives to the Rescue

The initial idea was to handle property and event bindings between the grid and the [ListService](https://docs.abp.io/en/abp/latest/UI/Angular/List-Service) instance, so we started working on an attribute directive that works as an adapter. Later, we removed all appearance-related properties too. The following is what we came up with in the end:

```html
<ngx-datatable
  [rows]="data$ | async"
  [count]="totalCount$ | async"
  [list]="list"
  default
>
  <!-- templates here -->
</ngx-datatable>
```

Sweet, right? Thanks to two attribute directives, we now have much less code to worry about and a better focus on what really matters.

The first directive, which has `ngx-datatable[list]` as its selector, provides a single point of communication between the `DatatableComponent` and the `ListService`. The second directive, `ngx-datatable[default]`, eliminates the noise created by property bindings that exist just to make ngx-datatable styles match our project.

We could have built only one directive, but we followed the single responsibility principle and ended up creating one for appearance and another for functionality. Our intention is to grant ABP developers the flexibility to remove the default appearance when they want to implement their own styles.

## The Adapter Directive

I am not planning to show you the implementation details of the actual ListService. All you need to know is that it is built based on the ABP backend and is UI independent. I will, however, describe what we needed to bind from it to the grid, and vice versa.
First, let us take a look at the directive code:

```js
@Directive({
  // tslint:disable-next-line
  selector: 'ngx-datatable[list]',
  exportAs: 'ngxDatatableList',
})
export class NgxDatatableListDirective implements OnChanges, OnDestroy, OnInit {
  private subscription = new Subscription();

  @Input() list: ListService;

  constructor(private table: DatatableComponent, private cdRef: ChangeDetectorRef) {
    this.table.externalPaging = true;
    this.table.externalSorting = true;
  }

  private subscribeToPage() {
    const sub = this.table.page.subscribe(({ offset }) => {
      this.list.page = offset;
      this.table.offset = offset;
    });
    this.subscription.add(sub);
  }

  private subscribeToSort() {
    const sub = this.table.sort.subscribe(({ sorts: [{ prop, dir }] }) => {
      this.list.sortKey = prop;
      this.list.sortOrder = dir;
    });
    this.subscription.add(sub);
  }

  private subscribeToIsLoading() {
    const sub = this.list.isLoading$.subscribe(loading => {
      this.table.loadingIndicator = loading;
      this.cdRef.markForCheck();
    });
    this.subscription.add(sub);
  }

  ngOnChanges({ list }: SimpleChanges) {
    if (!list.firstChange) return;

    const { maxResultCount, page } = list.currentValue;
    this.table.limit = maxResultCount;
    this.table.offset = page;
  }

  ngOnDestroy() {
    this.subscription.unsubscribe();
  }

  ngOnInit() {
    this.subscribeToPage();
    this.subscribeToSort();
    this.subscribeToIsLoading();
  }
}
```

Here is what every property and method does:

- `list` is for binding the `ListService` instance.
- `table` is the `DatatableComponent` instance, retrieved from the dependency injection system.
- `cdRef` is the `ChangeDetectorRef` instance, through which we can mark the host for change detection.
- `ngOnChanges` sets the `limit` and `offset` properties of the table at first change.
- `ngOnInit` initializes subscriptions to `page` and `sort` events of the grid, as well as the `isLoading$` of the service.
- `subscribeToPage`, `subscribeToSort`, and `subscribeToIsLoading` map observable properties at both ends.
- `subscription` is for collecting RxJS subscriptions, which are later unsubscribed from in the `ngOnDestroy` lifecycle hook.

The key takeaway here is that an attribute directive in Angular can obtain the host instance, hook into its events, manipulate its properties, and even run change detection on it manually when necessary. The possibilities are endless. You can adapt a component to any interface and avoid the performance penalty of the default change detection strategy, if you like.

Another important aspect is the directive selector. The selector queries only `ngx-datatable` elements with a `list` attribute, effectively leaving alone both datatables without the attribute and other elements with a `list` property. You will probably want to place `// tslint:disable-next-line` above the selector though, because linters are usually configured to throw an error when directive selectors do not start with the app or library prefix.

## The Default Properties Directive

Creating an Angular directive for default properties is much easier compared to an adapter directive and, in consequence, probably less exciting. It is quite advantageous though. This is what it looks like:

```js
@Directive({
  // tslint:disable-next-line
  selector: 'ngx-datatable[default]',
  exportAs: 'ngxDatatableDefault',
})
export class NgxDatatableDefaultDirective {
  @Input() class = 'material bordered';

  @HostBinding('class')
  get classes(): string {
    return `ngx-datatable ${this.class}`;
  }

  constructor(private table: DatatableComponent) {
    this.table.columnMode = ColumnMode.force;
    this.table.footerHeight = 50;
    this.table.headerHeight = 50;
    this.table.rowHeight = 'auto';
  }
}
```

The `DatatableComponent` instance is again retrieved from Angular's dependency injection system. Then, several properties are set within the constructor, before any property binding to that instance occurs. Thus, the defaults defined by the `DatatableComponent` class are effectively overridden.
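Stripped of the Angular specifics, the adapter idea both directives rely on can be sketched in plain TypeScript. Everything below (`Table`, `ListService`, the `Emitter` class) is a hypothetical stand-in for illustration, not the real ngx-datatable or ABP API:

```typescript
type Handler<T> = (value: T) => void;

// A tiny stand-in for an observable event source.
class Emitter<T> {
  private handlers: Handler<T>[] = [];
  subscribe(handler: Handler<T>) { this.handlers.push(handler); }
  emit(value: T) { this.handlers.forEach(h => h(value)); }
}

// The "host component": exposes state and events, knows nothing about lists.
class Table {
  limit = 10;
  offset = 0;
  page = new Emitter<{ offset: number }>();
}

// The "service": UI-independent pagination state.
class ListService {
  maxResultCount = 10;
  page = 0;
}

// The "directive": receives the host instance and wires both sides together.
class ListAdapter {
  constructor(private table: Table, private list: ListService) {
    this.table.limit = this.list.maxResultCount;
    this.table.offset = this.list.page;
    this.table.page.subscribe(({ offset }) => (this.list.page = offset));
  }
}

const table = new Table();
const list = new ListService();
new ListAdapter(table, list);
table.page.emit({ offset: 3 }); // the user paginates the grid...
console.log(list.page);         // ...and the service state follows: 3
```

The template author only wires the adapter once; pagination state then flows between both objects without any per-page glue code.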
The only problematic property here is `class`, and that is due to [this implementation of host binding](https://github.com/swimlane/ngx-datatable/blob/master/projects/swimlane/ngx-datatable/src/lib/components/datatable.component.ts#L56) in ngx-datatable. The workaround is taking `class` as an input property and binding its value to the host element after concatenating it with the `ngx-datatable` class. When no class is given, a material theme is applied and some custom borders are added. Setting another class through the template is still possible, so the interface is unchanged. Finally, it would work perfectly fine if we referred to just `ngx-datatable` as the selector, but we want to give the developer the opportunity to drop this directive altogether, so we did not.

## Conclusion

People tend to adapt their components to UI kits and libraries, but the UI may change over time. Core API decisions and business logic, on the other hand, do not change as frequently. I believe we should adapt UI elements to our components and services, and not the other way around. Some libraries offer injectable adapters for this purpose, which is nice, because they can be employed to adjust the behavior. Most, though, do not. As you see, adapter attribute directives prove handy in such cases.

Placing the same properties repeatedly in templates is also troubling. I have always thought of this as a weak point of HTML. Now we have frontend frameworks, yet we keep doing the same. Angular provides us with tools to avoid this, and an attribute directive for default properties makes perfect sense in my opinion. Check your own project. You will surely find a good use for them.

I have created [a StackBlitz playground](https://stackblitz.com/edit/angular-adapter-directives?file=src/app/app.component.ts) with a dummy backend and a simplified list service. You can see both directives in action.

Thanks for reading. Have a nice day.

* * *

## Read More:

1. 
[Real-Time Messaging In A Distributed Architecture Using ABP, SignalR & RabbitMQ](https://volosoft.com/blog/RealTime-Messaging-Distributed-Architecture-Abp-SingalR-RabbitMQ) 2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub Pattern](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub) 3. [Why You Should Prefer Singleton Pattern over a Static Class?](https://volosoft.com/blog/Prefer-Singleton-Pattern-over-Static-Class)
In this article, I will show you the basics of a Webhook mechanism that uses the publish-subscribe pattern in an ASP.NET Core 3.1 project.

## What is a Webhook?

Webhooks are **user-defined HTTP callbacks**. They are usually triggered by some events, such as pushing code to a repository or a comment being posted to a blog. When that event occurs, the source site makes an HTTP request to the URL configured for the Webhook. Users can configure them to cause events on one site to invoke behavior on another. ([https://en.wikipedia.org/wiki/Webhook](https://en.wikipedia.org/wiki/Webhook))

We have implemented a Webhook mechanism that uses the publish-subscribe pattern in our [ASP.NET Boilerplate](https://aspnetboilerplate.com/) and [ASP.NET ZERO](https://aspnetzero.com/) projects. Users can subscribe to Webhook events (for example, `User.Created`, `Notification.NewNotification`, etc.); when such an event occurs, the application publishes a Webhook to the subscribed endpoints.

## Implementation

**Note:** In this article, I will not show any basic CRUD operation code. I will just share the parts that I think are important. If you want the full code and more, you can check the [ASP.NET Boilerplate GitHub page](https://github.com/aspnetboilerplate/aspnetboilerplate/tree/master/src/Abp/Webhooks).

**WebhookSubscription:** An entity in which we store Webhook subscriptions.

<script src="https://gist.github.com/demirmusa/fa060c320b7c37a341978ca427513f5d.js"></script>

The important thing is the subscription’s secret. You should set it when you create a Webhook subscription and never change it (unless your custom logic includes something like changing the Webhook secret).

<script src="https://gist.github.com/demirmusa/814cb517e8f90e128ee60fb9791748ed.js"></script>

We will sign our Webhook payload using that secret right before sending it to the client. Then, when the client receives the Webhook, it will check whether the payload has been signed correctly.
This way, the client will be able to verify that the data comes from the correct source. So, the secret is important. You can manage it with basic CRUD operations (example: [*WebhookSubscriptionManager*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp/Webhooks/WebhookSubscriptionManager.cs)).

**WebhookEvent:** An entity in which we store Webhook information. A Webhook event can be sent to a large number of subscriptions. Since we send Webhooks in separate background jobs, we need to store the Webhook data to use it again and again.

<script src="https://gist.github.com/demirmusa/cad749c7aae54764303f944891d4a643.js"></script>

You can manage it with basic CRUD operations (example: [*WebhookEventStore*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp.Zero.Common/Webhooks/WebhookEventStore.cs)).

**WebhookSendAttempt:** An entity in which we store all information about the Webhook send process.

<script src="https://gist.github.com/demirmusa/9a2b65214981d6f7a3108f3d02832427.js"></script>

You can manage it with basic CRUD operations (example: [*WebhookSendAttemptStore*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp.Zero.Common/Webhooks/WebhookSendAttemptStore.cs)).

**WebhookDefinition:** Definitions of the Webhooks that the system has.

<script src="https://gist.github.com/demirmusa/2e488b1c76d32bb7122712985e253de3.js"></script>

You can manage it with basic CRUD operations (example: [*WebhookDefinitionManager*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp/Webhooks/WebhookDefinitionManager.cs)).

**WebhookPayload:** The Webhook payload pattern that we use to send our Webhooks.

<script src="https://gist.github.com/demirmusa/9b4a6a9663dd42e6b9a6142e6104a01a.js"></script>

**WebhookPublisher:** The class that we will use to publish Webhooks to all subscriptions. It is responsible for checking subscriptions and creating a background job that sends the Webhooks.
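Stripped down to its essence, the publish step works like this: find the subscriptions that listen to the event, then enqueue one background send job per matching subscription. Here is a rough TypeScript sketch of that idea; all names (`Subscription`, `publishWebhook`, etc.) are illustrative, not ABP's actual API:

```typescript
interface Subscription {
  webhookUri: string;
  secret: string;
  webhooks: string[]; // subscribed event names, e.g. "User.Created"
}

interface SendJob {
  webhookUri: string;
  event: string;
  data: unknown;
}

const subscriptions: Subscription[] = [
  { webhookUri: 'https://a.example/hook', secret: 's1', webhooks: ['User.Created'] },
  { webhookUri: 'https://b.example/hook', secret: 's2', webhooks: ['Note.New'] },
];

const jobQueue: SendJob[] = [];

// Publisher: find matching subscriptions and enqueue one
// background send job per subscription.
function publishWebhook(event: string, data: unknown): void {
  for (const sub of subscriptions.filter(s => s.webhooks.includes(event))) {
    jobQueue.push({ webhookUri: sub.webhookUri, event, data });
  }
}

publishWebhook('User.Created', { userName: 'john' });
console.log(jobQueue.length); // one job, for the single matching subscription
```

In the real implementation the event is also persisted once (the `WebhookEvent` entity above), so each job only carries a reference to it.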
*Publishing workflow:*

*Code:*

<script src="https://gist.github.com/demirmusa/7d6ada155b0a2787fc7c56c7a094872b.js"></script>

*(You can check* [*github.com/aspnetboilerplate/DefaultWebhookPublisher*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp/Webhooks/DefaultWebhookPublisher.cs) *for more information.)*

**WebhookSenderJob**

*Workflow:*

*Code:*

<script src="https://gist.github.com/demirmusa/4006e83b63352c4c2d0fe9e59d3cd92c.js"></script>

*(You can check* [*github.com/aspnetboilerplate/DefaultWebhookPublisher*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/dev/src/Abp/Webhooks/DefaultWebhookPublisher.cs) *for more information.)*

[**WebhookSender**](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp/Webhooks/DefaultWebhookSender.cs)

*Workflow:*

*Code:*

*SendWebhook:*

<script src="https://gist.github.com/demirmusa/ccc09ccdf5badf6bb518792ad7d9afbf.js"></script>

*GetSerializedBodyAsync:*

<script src="https://gist.github.com/demirmusa/22eca4a0fba7041dd30fd9ee9114c387.js"></script>

*SignWebhookRequest:*

<script src="https://gist.github.com/demirmusa/8f79197a1d3c00eef49b9e69dcd92b82.js"></script>

*SendHttpRequest:*

<script src="https://gist.github.com/demirmusa/ef6e37afb4dbe50df54a98b80452cb5f.js"></script>

***Note:*** *If you use ASP.NET Core, using* ***IHttpClientFactory*** *is the recommended way to create an HttpClient. For an example, you can check* [*github.com/aspnetboilerplate/AspNetCoreWebhookSender*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp.AspNetCore/AspNetCore/Webhook/AspNetCoreWebhookSender.cs)*.*

*(You can check the rest of the code and see it all together in* [*github.com/aspnetboilerplate/DefaultWebhookSender*](https://github.com/aspnetboilerplate/aspnetboilerplate/blob/master/src/Abp/Webhooks/DefaultWebhookSender.cs)*.)*

All done! Now we can use the Webhook publisher.
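The signing step boils down to computing an HMAC of the serialized payload with the subscription secret, which the receiver then recomputes and compares. Below is a rough TypeScript/Node sketch of that idea; the `sha256` algorithm and the function names are assumptions for illustration, not ABP's actual implementation (see the gists above for that):

```typescript
import { createHmac, timingSafeEqual } from 'crypto';

// Sender side: sign the serialized payload with the shared secret.
function signPayload(secret: string, body: string): string {
  return createHmac('sha256', secret).update(body, 'utf8').digest('base64');
}

// Receiver side: recompute the signature and compare in constant time.
function verifyPayload(secret: string, body: string, signature: string): boolean {
  const expected = Buffer.from(signPayload(secret, body));
  const received = Buffer.from(signature);
  return expected.length === received.length && timingSafeEqual(expected, received);
}

const secret = 'subscription-secret'; // per-subscription secret, never changed
const body = JSON.stringify({ event: 'User.Created', data: { userName: 'john' } });
const signature = signPayload(secret, body);

console.log(verifyPayload(secret, body, signature));              // true
console.log(verifyPayload(secret, body + 'tampered', signature)); // false
```

Because only the publisher and the subscriber know the secret, a valid signature proves both the origin and the integrity of the payload.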
# **Testing Our Code**

**On the server side**

*Add a subscription:*

<script src="https://gist.github.com/demirmusa/d43c56e502a34bceda3a82f5459334fb.js"></script>

*Send a Webhook:*

<script src="https://gist.github.com/demirmusa/89540cb8e0248e1426e3c354fe0793a8.js"></script>

Publishing is done. It will be sent to all subscribed clients.

**C# client-side example:**

<script src="https://gist.github.com/demirmusa/7911631546e0848859a5b08c33870eb4.js"></script>

For more information, you can check:

[https://aspnetboilerplate.com/Pages/Documents/Webhook-System](https://aspnetboilerplate.com/Pages/Documents/Webhook-System)

[https://github.com/aspnetboilerplate/aspnetboilerplate](https://github.com/aspnetboilerplate/aspnetboilerplate)

* * *

## Read More:

1. [Real-Time Messaging In A Distributed Architecture Using ABP, SignalR & RabbitMQ](https://volosoft.com/blog/RealTime-Messaging-Distributed-Architecture-Abp-SingalR-RabbitMQ)
2. [Why You Should Prefer Singleton Pattern over a Static Class?](https://volosoft.com/blog/Prefer-Singleton-Pattern-over-Static-Class)
3. [How to Use Attribute Directives to Avoid Repetition in Angular Templates](https://volosoft.com/blog/attribute-directives-to-avoid-repetition-in-angular-templates)
**Azure Key Vault** is a cloud service that provides a secure store for secrets. You can securely store keys, passwords, certificates, and other secrets. For more information about Azure Key Vault, please refer to its [documentation](https://docs.microsoft.com/en-us/aspnet/core/security/key-vault-configuration). Azure Key Vault provides two access methods, **Certificate** and **Managed**. We will use the Certificate method in our sample.

The Azure Key Vault service is suitable for use in production, but in some cases developers might want to access Azure Key Vault from the development environment. I will explain how to access Azure Key Vault from an ASP.NET Core application that runs on the local development environment.

In order to use Azure Key Vault, you must have an Azure account. If you don’t have one, you can create a free account on https://azure.microsoft.com/.

# Register Azure AD Application

As the first step, we will register a new app on Azure Active Directory. Go to https://portal.azure.com/, navigate to the Azure Active Directory section, and then to the “App Registrations” section. Click “New Registration” and register your app. I will name my app “**abp.io-vault**” in this sample.

# Upload Local Certificate to Azure

Since we are going to use the Certificate method to connect to our Key Vault, we must upload our localhost certificate for the app we have registered. ASP.NET Core creates a certificate for development purposes. In order to upload our local development certificate, we must first export it. To do that, press **Windows+R** and run **certmgr.msc**. This will open the Windows Certificate Manager.

> *If you are using another Operating System, you can search how to export a local certificate for your Operating System on Google.*

In Certificate Manager, navigate to “Trusted Root Certification Authorities” and click Certificates. Search for “localhost” and select the one with the “IIS Express Development Certificate” friendly name.
Right-click the selected certificate and select “All Tasks > Export…” to export the certificate. You must select exporting the **private key** to use the exported certificate on Azure. Save the file to a location on your computer. We will upload the exported file to Azure later.

# Create Azure Key Vault

In order to create an Azure Key Vault, go to https://portal.azure.com/, search for “Key vaults” and navigate to the Key vaults section. Then, click **Add** to create a key vault. Fill in the form and create your key vault.

At this step, you can go to the “Access policy” tab and create a new policy. You must select the application we have registered before as the principal for the access policy here. I have selected “**abp.io-vault**”. Then, create the key vault.

After creating the key vault, go to the details of the created key vault and create a new secret named **VerySecretValue**. We will use this value for our test.

# Upload Certificate

Before using the key vault we have created, there is one more step: we will upload the certificate we exported before to Azure. Go to Azure Active Directory and then App Registrations. Select the app you have registered before. Go to “Certificates & secrets” and upload the certificate using the “Upload certificate” button. This will generate a Thumbprint value, which we will use in our ASP.NET Core app.

# Test application

We will create a new ASP.NET Core application to test this. I created a new project with [ABP.IO](https://abp.io/get-started) using the command below.

````bash
abp new AzureKeyVaultSample
````

In order to use Azure Key Vault, we must add the [Microsoft.Extensions.Configuration.AzureKeyVault](https://www.nuget.org/packages/Microsoft.Extensions.Configuration.AzureKeyVault/) NuGet package to our project. I have added it to my *.Web project. You can add it to the project that contains Program.cs if you haven’t created your project using [ABP.IO](https://abp.io/get-started).
After adding the NuGet package, let’s configure our web application to read values from Azure Key Vault. Add the statement below to `CreateHostBuilder` in Program.cs.

````csharp
Host.CreateDefaultBuilder(args)
    // your existing configurations
    .ConfigureAppConfiguration((context, config) =>
    {
        using (var store = new X509Store(StoreLocation.CurrentUser))
        {
            store.Open(OpenFlags.ReadOnly);
            var certs = store.Certificates
                .Find(X509FindType.FindByThumbprint, "Thumbprint", false);

            config.AddAzureKeyVault(
                $"https://{KeyVaultName}.vault.azure.net/",
                "ApplicationId",
                certs.OfType<X509Certificate2>().Single());

            store.Close();
        }
    })
````

- `Thumbprint`: We retrieved this value when we uploaded the certificate earlier.
- `KeyVaultName`: This is the name of the key vault we have created.
- `ApplicationId`: This is the Id of the application we have registered in Azure AD.

Now, our ASP.NET Core app is ready to read values from Azure Key Vault in our local development environment. Let’s change the source code of the Index page to retrieve the secret value from Azure Key Vault.

````csharp
public class IndexModel : AzureKeyVaultSamplePageModel
{
    private readonly IConfiguration _configuration;

    public IndexModel(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public string VerySecretValue { get; set; }

    public void OnGet()
    {
        VerySecretValue = _configuration.GetValue<string>("VerySecretValue");
    }
}
````

We just injected `IConfiguration` into our page, retrieved the value we entered before, and set it to a public property. We can then show this value on our page.

````html
<div>
    <!-- Other code blocks -->
    <p>
        This is a very secret value: @Model.VerySecretValue
    </p>
</div>
````

Now, we can run the project, navigate to the Index page, and see the value retrieved from Azure Key Vault.
## Introduction

After a long while, I had to create [a singleton service](https://github.com/abpframework/abp/blob/dev/framework/src/Volo.Abp.ObjectExtending/Volo/Abp/ObjectExtending/ObjectExtensionManager.cs) while developing [a feature](https://docs.abp.io/en/abp/latest/Object-Extensions) for the [ABP Framework](https://abp.io/). Then I decided to write an article about it: why and how we should prefer the singleton pattern over a static class.

OK, I agree that **creating a singleton class or a static class is, either way, not a good practice**. However, in practice, it can be unavoidable at some point. So, which one should we prefer? While the article title already answers this question, I will explain the details of this decision in this article. While this article uses **C#** as the programming language, the principles can be applied in any object-oriented language.

> You can get the **source code** from [my GitHub samples repository](https://github.com/hikalkan/samples/tree/master/SingletonVsStatic).

## Using Dependency Injection?

Then you are lucky and doing a good thing: you can use the **singleton lifetime**. ASP.NET Core allows you to register a class with the singleton lifetime. Assume you have a `MyCache` class and want to register it with the singleton lifetime. Write this inside the `ConfigureServices` method of your `Startup` class and that's all:

````csharp
services.AddSingleton<MyCache>();
````

You still need to care about **multi-threading**, and you **shouldn't inject transient/scoped services** into your singleton service (see my [dependency injection best practices guide](https://volosoft.com/blog/ASP.NET-Core-Dependency-Injection-Best-Practices,-Tips-Tricks) for more), but you don't need to manually implement the singleton pattern. The ASP.NET Core Dependency Injection system handles it. Whenever you need the `MyCache` service, just inject it like any other service.
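To make the guarantee concrete: "singleton lifetime" means every resolution of a registered token returns the same instance, created once. Here is a toy container sketched in TypeScript purely for illustration; the `Container` class below is hypothetical and is not the ASP.NET Core container:

```typescript
type Factory<T> = () => T;

// A toy DI container supporting only the singleton lifetime.
class Container {
  private factories = new Map<string, Factory<unknown>>();
  private singletons = new Map<string, unknown>();

  addSingleton<T>(token: string, factory: Factory<T>): void {
    this.factories.set(token, factory);
  }

  resolve<T>(token: string): T {
    if (!this.singletons.has(token)) {
      const factory = this.factories.get(token);
      if (!factory) throw new Error(`Not registered: ${token}`);
      this.singletons.set(token, factory()); // created once, lazily
    }
    return this.singletons.get(token) as T;
  }
}

class MyCache {
  private store = new Map<string, unknown>();
  add(key: string, value: unknown) { this.store.set(key, value); }
  getOrNull(key: string) { return this.store.get(key) ?? null; }
}

const container = new Container();
container.addSingleton('MyCache', () => new MyCache());

// Two independent resolutions see the same underlying state.
container.resolve<MyCache>('MyCache').add('answer', 42);
console.log(container.resolve<MyCache>('MyCache').getOrNull('answer')); // 42
```

The consumer never calls `new MyCache()` itself, which is exactly why no manual singleton implementation is needed when a container manages the lifetime.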
However, there can be some reasons to **manually implement the singleton pattern** even if you use dependency injection:

* The [ASP.NET Core Dependency Injection](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/dependency-injection) system doesn't allow you to use services **until the service registration phase completes**. If you need to use your service before or inside `ConfigureServices`, then you cannot benefit from dependency injection.
* You can't inject a service from a **static context** where you don't have access to the `IServiceProvider`. For example, dependency injection may not be usable in an [extension method](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/classes-and-structs/extension-methods).

## Singleton Pattern vs Static Class

There are two main reasons why you should prefer the singleton pattern over a static class. I will introduce them briefly here, then explain them in detail in the next sections.

### Testability

Singletons are **easily testable**, while a static class may not be:

* If your class stores state (data), running **multiple tests might affect each other**, so writing tests will be harder.
* Static classes are hard or impossible to **mock**. So, if you are testing a class that depends on a static class, mocking may not be an easy option.

### Extensibility

* It is not possible to **inherit** from a static class, while it is possible with the singleton pattern if you want to allow it. So, anyone can inherit from a singleton class, override a method and replace the service.
* It is not possible to write an [extension method](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/classes-and-structs/extension-methods) for a static class, while it is possible for a singleton object.

## The Solution Overview

I created a solution that implements two caching services, one with the singleton pattern, the other as a static class.
I also created unit tests for both of the services.

You can get the source code from [my GitHub samples repository](https://github.com/hikalkan/samples/tree/master/SingletonVsStatic). The rest of the article will be based on this solution.

## Implementing a Singleton Service

Let's see a simple caching class that is implemented via the singleton pattern:

````csharp
public class SingletonCache
{
    public static SingletonCache Instance { get; protected set; } = new SingletonCache();

    private readonly IDictionary<string, object> _cacheDictionary;

    protected internal SingletonCache()
    {
        _cacheDictionary = new Dictionary<string, object>();
    }

    public virtual void Add(string key, object value)
    {
        lock (_cacheDictionary)
        {
            _cacheDictionary[key] = value;
        }
    }

    public virtual object GetOrNull(string key)
    {
        lock (_cacheDictionary)
        {
            if (_cacheDictionary.TryGetValue(key, out object value))
            {
                return value;
            }

            return null;
        }
    }

    public virtual object GetOrAdd(string key, Func<object> factory)
    {
        lock (_cacheDictionary)
        {
            var value = GetOrNull(key);
            if (value != null)
            {
                return value;
            }

            value = factory();
            Add(key, value);
            return value;
        }
    }

    public virtual void Clear()
    {
        lock (_cacheDictionary)
        {
            _cacheDictionary.Clear();
        }
    }

    public virtual bool Remove(string key)
    {
        lock (_cacheDictionary)
        {
            return _cacheDictionary.Remove(key);
        }
    }

    public virtual int GetCount()
    {
        lock (_cacheDictionary)
        {
            return _cacheDictionary.Count;
        }
    }
}
````

* The static `Instance` property is the object that should be used by other classes, like `SingletonCache.Instance.Add(...)` to add a new item to the cache.
* I marked the setter as `protected set` to make it settable/replaceable only by a derived class.
* The `_cacheDictionary` is not static, because the object (`Instance`) is already static.
* I declared the constructor as `protected internal` because:
  * `protected` makes it possible to inherit from this class.
  * `internal` makes it possible to create an instance of this class from the same assembly or an allowed assembly.
I allowed the `SingletonVsStatic.SingletonLib.Tests` project, so it can create an instance to test it (I used the `InternalsVisibleTo` attribute in `Properties/AssemblyInfo.cs` of the `SingletonVsStatic.SingletonLib` project to make it possible).

* I made all methods `virtual`, so a derived class can override them.

### Extending the SingletonCache

#### Creating an Extension Method

Anyone can write an extension method for the `SingletonCache` class. Example: you find that you are frequently repeating yourself with a code block like this:

````csharp
var myValue = SingletonCache.Instance.GetOrNull("MyKey");
if (myValue == null)
{
    throw new ApplicationException("MyKey was not present in the cache!");
}

//...continue to your logic if it is not null
````

It would be better to have a `GetOrThrowException` method that throws an exception if the given key is not present:

````csharp
var myValue = SingletonCache.Instance.GetOrThrowException("MyKey");
//...continue to your logic if it is not null
````

You can easily create such an extension method yourself:

````csharp
public static class MySingletonCacheExtensions
{
    public static object GetOrThrowException(
        this SingletonCache singletonCache,
        string key)
    {
        var value = singletonCache.GetOrNull(key);
        if (value == null)
        {
            throw new ApplicationException(
                "Given key was not present in the cache: " + key);
        }

        return value;
    }
}
````

That would not be possible if it were a static class.

#### Deriving from the SingletonCache

If `SingletonCache` is in a library that you are using, you can't change its source code. But you can inherit from it and override a method if you like. Example: you see that the `Add` method allows adding `null` values to the cache, which may not be a good thing for your application.
You can inherit from the `SingletonCache` and override the `Add` method:

````csharp
public class MySingletonCache : SingletonCache
{
    public static void Replace()
    {
        SingletonCache.Instance = new MySingletonCache();
    }

    public override void Add(string key, object value)
    {
        if (value == null)
        {
            throw new ArgumentNullException(nameof(value));
        }

        base.Add(key, value);
    }
}
````

Then you need to call the `MySingletonCache.Replace()` method at the beginning of your application to replace `SingletonCache.Instance` with your own implementation.

### Testing the SingletonCache

Testing the `SingletonCache` is easy. I used [xUnit](https://xunit.net/) for the tests. xUnit creates **a new instance of the test class** for each test method, so I don't care whether tests affect each other: every test uses a different cache object, initialized to the same initial state (two sample values added in the constructor). I can even **run all tests in parallel**, no problem!

````csharp
public class SingletonCache_Tests
{
    private readonly SingletonCache _singletonCache;

    public SingletonCache_Tests()
    {
        _singletonCache = new SingletonCache();
        _singletonCache.Add("TestKey1", "TestValue1");
        _singletonCache.Add("TestKey2", "TestValue2");
    }

    [Fact]
    public void Should_Contain_Initial_Values()
    {
        Assert.Equal(2, _singletonCache.GetCount());
        Assert.Equal("TestValue1", _singletonCache.GetOrNull("TestKey1"));
        Assert.Equal("TestValue2", _singletonCache.GetOrNull("TestKey2"));
    }

    [Fact]
    public void Should_Add_And_Get_Values()
    {
        _singletonCache.Add("MyNumber", 42);
        Assert.Equal(42, _singletonCache.GetOrNull("MyNumber"));
    }

    [Fact]
    public void Should_Increase_Count_When_A_New_Item_Added()
    {
        Assert.Equal(2, _singletonCache.GetCount());
        _singletonCache.Add("TestKeyX", "X");
        Assert.Equal(3, _singletonCache.GetCount());
    }

    [Fact]
    public void Clear_Should_Delete_All_Values()
    {
        _singletonCache.Clear();
        Assert.Equal(0, _singletonCache.GetCount());
        Assert.Null(_singletonCache.GetOrNull("TestKey1"));
    }

    [Fact]
    public void Should_Remove_Values()
    {
        _singletonCache.Remove("TestKey1");
        Assert.Null(_singletonCache.GetOrNull("TestKey1"));
    }

    [Fact]
    public void Should_Use_Factory_Only_If_The_Value_Was_Not_Present()
    {
        //The key is already present, so it doesn't use the factory to create a new one
        Assert.Equal("TestValue1",
            _singletonCache.GetOrAdd("TestKey1", () => "TestValue1_Changed"));

        _singletonCache.Remove("TestKey1");

        //The key is not present, so it uses the factory to create a new one
        Assert.Equal("TestValue1_Changed",
            _singletonCache.GetOrAdd("TestKey1", () => "TestValue1_Changed"));
    }

    [Fact]
    public void GetOrThrowException_Should_Throw_Exception_For_Unknown_Keys()
    {
        Assert.Throws<ApplicationException>(() =>
        {
            _singletonCache.GetOrThrowException("UnknownKey");
        });
    }
}
````

### Implementing an Interface?

If you are developing a **reusable class library**, it would be a good idea to create an `ISingletonCache` interface that is implemented by the `SingletonCache` class. This way, library users can replace your service without having to inherit from it; they can simply **re-implement** the interface.
## Implementing a Static Class

The same singleton class could be implemented as a static class, as shown below:

````csharp
public static class StaticCache
{
    private static readonly IDictionary<string, object> _cacheDictionary;

    static StaticCache()
    {
        _cacheDictionary = new Dictionary<string, object>();
    }

    public static void Add(string key, object value)
    {
        lock (_cacheDictionary)
        {
            _cacheDictionary[key] = value;
        }
    }

    public static object GetOrNull(string key)
    {
        lock (_cacheDictionary)
        {
            if (_cacheDictionary.TryGetValue(key, out object value))
            {
                return value;
            }

            return null;
        }
    }

    public static object GetOrAdd(string key, Func<object> factory)
    {
        lock (_cacheDictionary)
        {
            var value = GetOrNull(key);
            if (value != null)
            {
                return value;
            }

            value = factory();
            Add(key, value);
            return value;
        }
    }

    public static void Clear()
    {
        lock (_cacheDictionary)
        {
            _cacheDictionary.Clear();
        }
    }

    public static bool Remove(string key)
    {
        lock (_cacheDictionary)
        {
            return _cacheDictionary.Remove(key);
        }
    }

    public static int GetCount()
    {
        lock (_cacheDictionary)
        {
            return _cacheDictionary.Count;
        }
    }
}
````

### Extending the StaticCache

* There is no way to inherit from it and override a method.
* There is no way to create an extension method for it.

### Testing the StaticCache

Testing the `StaticCache` is possible. However, **your tests may easily affect each other**, since they will all be executed on the same object and you cannot control the execution order of the tests. I've created a test class, `StaticCache_Tests`, inside the solution.
I will partially share it here (you can see the [full source code](https://github.com/hikalkan/samples/blob/master/SingletonVsStatic/SingletonVsStatic.StaticLib.Tests/StaticCache_Tests.cs)):

````csharp
public class StaticCache_Tests
{
    static StaticCache_Tests()
    {
        StaticCache.Add("TestKey1", "TestValue1");
        StaticCache.Add("TestKey2", "TestValue2");
    }

    [Fact]
    public void Should_Contain_Initial_Values()
    {
        Assert.Equal(2, StaticCache.GetCount());
        Assert.Equal("TestValue1", StaticCache.GetOrNull("TestKey1"));
        Assert.Equal("TestValue2", StaticCache.GetOrNull("TestKey2"));
    }

    [Fact]
    public void Should_Add_And_Get_Values()
    {
        StaticCache.Add("MyNumber", 42);
        Assert.Equal(42, StaticCache.GetOrNull("MyNumber"));
    }

    [Fact]
    public void Should_Increase_Count_When_A_New_Item_Added()
    {
        Assert.Equal(2, StaticCache.GetCount());
        StaticCache.Add("TestKeyX", "X");
        Assert.Equal(3, StaticCache.GetCount());
    }

    //...other tests
}
````

When I run all of these tests together (in parallel or not, it doesn't matter, since I can't control the execution order), **some tests randomly fail**.

When I run one of the failing tests alone (without running the others), it passes. They just can't work together properly. Even if I were able to control the execution order of the test methods, I would have to consider the results of the previous tests when writing a new test method. That means any change in a test method may affect subsequent test methods. It also makes it impossible to run the tests in parallel, which is very important for reducing the total test time.

## About Multi-Threading

Whether you use a singleton or a static class, you need to **care about multi-threading**, since multiple threads (requests, in a web application) may use your service concurrently. Using `lock` statements while accessing shared resources is one way to overcome this problem. You can use other techniques based on your performance and functional requirements.
For example, using a [ReaderWriterLockSlim](https://docs.microsoft.com/en-us/dotnet/api/system.threading.readerwriterlockslim) would perform better for such a cache class.

## Conclusion

* Use Dependency Injection with **singleton lifetime** as a best practice wherever possible.
* Use the singleton pattern rather than static classes. Using a singleton has **no additional cost** for you, while it has important advantages.
Design patterns are proven, practical, and reusable solutions for tackling specific problems in software development. They not only help us avoid pitfalls in organizing our applications, but also provide a shared glossary to describe our implementations and understand each other as fellow developers. There are dozens of patterns already discovered, and I am pretty sure you are using at least some of them, even if you do not identify them as a design pattern (Hello constructor pattern 👋).

Design patterns are grouped into three categories: creational, structural, and behavioral. In this article, I would like to focus on one of the behavioral patterns, the strategy (a.k.a. policy) pattern, and how we benefit from it in the [ABP Framework](https://abp.io) frontend. I am hoping this article will help you understand and use both the pattern and ABP features more effectively than ever.

## What is Strategy Pattern?

I like explaining concepts with code examples and, since we shall see the use of the strategy pattern in Angular later, the code examples here are in TypeScript. That being said, the JavaScript implementation is quite similar.
So, let's check out what the Avengers would look like if they were represented by a class:

```ts
class Hero {
  constructor(public name: string, public weapon?: string) {}
}

class Avengers {
  private ensemble: Hero[] = [];

  private blast(hero: Hero) {
    console.log(`${hero.name} blasted ${hero.weapon}`);
  }

  private kickAndPunch(hero: Hero) {
    console.log(`${hero.name} kicked and punched`);
  }

  private shoot(hero: Hero) {
    console.log(`${hero.name} shot ${hero.weapon}`);
  }

  private throw(hero: Hero) {
    console.log(`${hero.name} threw ${hero.weapon}`);
  }

  recruit(hero: Hero) {
    // keep every other hero and replace any existing hero with the same name
    this.ensemble = this.ensemble
      .filter(({name}) => name !== hero.name)
      .concat(hero);
  }

  fight() {
    this.ensemble.forEach(hero => this.attack(hero));
  }

  attack(hero: Hero) {
    switch (hero.name) {
      case 'Iron Man':
        this.blast(hero);
        break;
      case 'Captain America':
      case 'Thor':
        this.throw(hero);
        break;
      case 'The Hulk':
        this.kickAndPunch(hero);
        break;
      case 'Black Widow':
        hero.weapon ? this.shoot(hero) : this.kickAndPunch(hero);
        break;
      case 'Hawkeye':
        this.shoot(hero);
        break;
      default:
        console.warn('Unknown Avenger: ' + hero.name);
    }
  }
}
```

Although it looks OK at first, this class has the following drawbacks:

- It is difficult to `recruit` a new `Hero` to `fight` with the other `Avengers`, because you will need to add another case (and probably a new attack type) for the new hero.
- It is also difficult to change or remove an existing `Hero`. Consider changing Thor's attack from throwing his hammer, Mjolnir, to summoning a thunder strike.
- Each different attack is just a one-liner here, but consider how difficult `Avengers` would become to read and maintain if the attacks were longer and more complex.
- `Avengers` has to provide an attack for all heroes, although some of them might not currently be in the `ensemble`. This is not tree-shakable and could lead to a waste of resources.
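To make the first drawback concrete, here is a condensed, runnable sketch of the same dispatch (the weapons are made up for illustration): any hero the `switch` does not know about falls through to the default case until you edit the method itself.

```ts
class Hero {
  constructor(public name: string, public weapon?: string) {}
}

// Condensed version of the switch-based dispatch above, trimmed to two cases.
function attack(hero: Hero): string {
  switch (hero.name) {
    case "Iron Man":
      return `${hero.name} blasted ${hero.weapon}`;
    case "The Hulk":
      return `${hero.name} kicked and punched`;
    default:
      // Every newly recruited hero lands here until the switch is extended.
      return `Unknown Avenger: ${hero.name}`;
  }
}

console.log(attack(new Hero("Iron Man", "repulsor beams"))); // Iron Man blasted repulsor beams
console.log(attack(new Hero("Spider-Man", "webs"))); // Unknown Avenger: Spider-Man
```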
The strategy design pattern decouples the context from an interchangeable algorithm's implementation by delegating it to another class, which is bound by a contract defined via the strategy interface.

So, let's refactor `Avengers` and see what it looks like when the strategy pattern is applied:

```ts
abstract class Hero {
  constructor(public name: string, public weapon?: string) {}

  abstract attack(): void;
}

class BlastingHero extends Hero {
  attack() {
    console.log(`${this.name} blasted ${this.weapon}.`);
  }
}

class ShootingHero extends Hero {
  attack() {
    console.log(`${this.name} shot ${this.weapon}.`);
  }
}

class ThrowingHero extends Hero {
  attack() {
    console.log(`${this.name} threw ${this.weapon}.`);
  }
}

class UnarmedHero extends Hero {
  attack() {
    console.log(`${this.name} kicked and punched.`);
  }
}

class Avengers {
  private ensemble: Hero[] = [];

  recruit(hero: Hero) {
    // keep every other hero and replace any existing hero with the same name
    this.ensemble = this.ensemble
      .filter(({name}) => name !== hero.name)
      .concat(hero);
  }

  fight() {
    this.ensemble.forEach(hero => hero.attack());
  }
}
```

> Instead of creating an interface and implementing it, we have benefited from inheritance here to avoid repetition, but both have the same effect with regard to the strategy pattern.

Let's check the organization of the new code:

- The `Hero` abstract class provides us with the **strategy**, the contract that guarantees the algorithm is implemented by each hero. We could have used an interface, but an abstract class is beneficial here.
- We need **concrete strategies** which implement the algorithm. In this case, they are the subclasses. Heroes will be instances of these subclasses and each will have its own `attack`.
- The **context**, `Avengers`, will refer to the method guaranteed by the contract when `fight` is called.

## Advantages of Strategy Pattern

There are some advantages we gained by implementing the strategy pattern above:

1. The `switch` statement is gone. We no longer need to check a condition to determine what to do next.
2.
It is much easier to understand and maintain `Avengers` now.
3. It is also easier to test `Avengers` than it was before.
4. `Avengers` can `recruit` any new `Hero` (Spider-Man, Ant-Man, Scarlet Witch, Falcon, etc.) and their `attack` will just work. Therefore, another concrete strategy can always be introduced, and the functionality is much more extensible.
5. `Avengers` is now able to switch between available strategies at runtime. In other words, it is capable of replacing an `UnarmedHero` with a `ShootingHero`. Think about Black Widow, who can be both.
6. If a concrete strategy, a subclass of `Hero`, is not used, it could possibly be tree-shaken.

## Drawbacks of Strategy Pattern

There are but a few drawbacks which could be associated with the strategy pattern:

1. The client (the consumer of `Avengers`, Nick Fury?) should be aware of the available strategies in order to `recruit` the correct one.
2. There may be some peaceful `Hero` who will not `attack` at all, but still has to implement a noop method, just to align with the contract. Think about Bruce Banner (not the Hulk), who can contribute to the `Avengers` with his science and not participate in the `fight`.
3. There is an increased number of objects generated. In some edge cases, this can cause an overhead. There is another design pattern, the flyweight pattern, that can be used to lower this overhead.
4. There are cases when parameters are passed to methods defined by the strategy. Not all strategies use all parameters, but they still have to be generated and passed.

## How ABP Benefits From the Strategy Pattern

Several services in ABP Angular packages employ the strategy pattern, and we are planning to refactor more of the existing ones into this pattern.
Let's see how `DomInsertionService` is used:

```ts
import { DomInsertionService, CONTENT_STRATEGY } from '@abp/ng.core';

@Component({
  /* class metadata here */
})
class DemoComponent {
  constructor(private domInsertionService: DomInsertionService) {}

  ngOnInit() {
    const scriptElement = this.domInsertionService.insertContent(
      CONTENT_STRATEGY.AppendScriptToBody('alert()')
    );
  }
}
```

As you can see, we have an `insertContent` method, and we are passing the content to be inserted to it via a predefined content strategy, `CONTENT_STRATEGY.AppendScriptToBody`. Let's check the content strategy:

```ts
import {
  ContentSecurityStrategy,
  CONTENT_SECURITY_STRATEGY,
} from './content-security.strategy';
import { DomStrategy, DOM_STRATEGY } from './dom.strategy';

export abstract class ContentStrategy<T extends HTMLScriptElement | HTMLStyleElement = any> {
  constructor(
    public content: string,
    protected domStrategy: DomStrategy = DOM_STRATEGY.AppendToHead(),
    protected contentSecurityStrategy: ContentSecurityStrategy = CONTENT_SECURITY_STRATEGY.None(),
  ) {}

  abstract createElement(): T;

  insertElement(): T {
    const element = this.createElement();
    this.contentSecurityStrategy.applyCSP(element);
    this.domStrategy.insertElement(element);
    return element;
  }
}

export class StyleContentStrategy extends ContentStrategy<HTMLStyleElement> {
  createElement(): HTMLStyleElement {
    const element = document.createElement('style');
    element.textContent = this.content;
    return element;
  }
}

export class ScriptContentStrategy extends ContentStrategy<HTMLScriptElement> {
  createElement(): HTMLScriptElement {
    const element = document.createElement('script');
    element.textContent = this.content;
    return element;
  }
}

export const CONTENT_STRATEGY = {
  AppendScriptToBody(content: string) {
    return new ScriptContentStrategy(content, DOM_STRATEGY.AppendToBody());
  },
  AppendScriptToHead(content: string) {
    return new ScriptContentStrategy(content, DOM_STRATEGY.AppendToHead());
  },
  AppendStyleToHead(content: string) {
    return new
StyleContentStrategy(content, DOM_STRATEGY.AppendToHead());
  },
  PrependStyleToHead(content: string) {
    return new StyleContentStrategy(content, DOM_STRATEGY.PrependToHead());
  },
};
```

As seen above, `ContentStrategy` defines a contract that consumes two other strategies: `DomStrategy` and `ContentSecurityStrategy`. We are not going to dig deeper and examine these strategies, but the main takeaway here is that ABP employs a composition of strategies to build complex ones. Another important point here is that, although some predefined strategies are exported as a constant (`CONTENT_STRATEGY`), you can always develop new strategies by using the superclasses and/or constructing different compositions.

Let's take a closer look at the `DomInsertionService` class:

```ts
import { Injectable } from '@angular/core';
import { ContentStrategy } from '../strategies/content.strategy';
import { generateHash } from '../utils';

@Injectable({ providedIn: 'root' })
export class DomInsertionService {
  private readonly inserted = new Set<number>();

  insertContent<T extends HTMLScriptElement | HTMLStyleElement>(
    contentStrategy: ContentStrategy<T>,
  ): T {
    const hash = generateHash(contentStrategy.content);

    if (this.inserted.has(hash)) return;

    const element = contentStrategy.insertElement();
    this.inserted.add(hash);

    return element;
  }

  removeContent(element: HTMLScriptElement | HTMLStyleElement) {
    const hash = generateHash(element.textContent);
    this.inserted.delete(hash);
    element.parentNode.removeChild(element);
  }

  has(content: string): boolean {
    const hash = generateHash(content);
    return this.inserted.has(hash);
  }
}
```

The `insertElement` method of the `ContentStrategy` is called, and that's pretty much it. The insertion algorithm is delegated to the strategy. As a result, there is not much to read and maintain here, yet the service is capable of doing almost any DOM insertion we may ever need.
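To make the "composition of strategies" idea concrete outside of Angular and the DOM, here is a tiny, self-contained sketch (all names here are mine, not ABP's) in which a context consumes an interchangeable insertion strategy, mirroring how `ContentStrategy` consumes a `DomStrategy`:

```ts
// The strategy contract: decides where a value lands in the target.
interface InsertStrategy {
  insert(target: string[], value: string): void;
}

// Predefined building blocks, echoing DOM_STRATEGY's shape.
const INSERT_STRATEGY = {
  Append: (): InsertStrategy => ({ insert: (t, v) => t.push(v) }),
  Prepend: (): InsertStrategy => ({ insert: (t, v) => t.unshift(v) }),
};

// The context delegates the actual insertion to its strategy.
class SimpleContentStrategy {
  constructor(
    public content: string,
    private insertStrategy: InsertStrategy = INSERT_STRATEGY.Append(),
  ) {}

  insertInto(target: string[]): void {
    this.insertStrategy.insert(target, this.content);
  }
}

// "Developing a new strategy" is just composing different building blocks:
const PrependContent = (content: string) =>
  new SimpleContentStrategy(content, INSERT_STRATEGY.Prepend());

const doc: string[] = ['existing'];
new SimpleContentStrategy('appended').insertInto(doc);
PrependContent('prepended').insertInto(doc);
console.log(doc); // ['prepended', 'existing', 'appended']
```

Swapping the insertion behavior never requires touching `SimpleContentStrategy` itself, which is exactly the property the ABP services rely on.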
## Conclusion

The strategy pattern is a proven and reusable way of making your code more flexible and extensible. It also usually leads to a more readable, testable, and maintainable codebase. ABP Angular services like `LazyLoadService`, `ContentProjectionService`, and `DomInsertionService` already make use of the pattern, and we are hoping to deliver more such services in the near future.

Thank you for reading.
I usually run Web projects from **Visual Studio**, but I needed to run an **ASP.NET Core 3.1 Web** project from the **command line** on **IIS Express**. At first I thought it would be easy, but I stumbled in a few places. To keep this article simple, I won't write up all of my attempts. Here's how you can start an ASP.NET Core 3.1 Web project from the command line:

**run-host.bat**

```bash
SET ASPNETCORE_ENVIRONMENT=Development
SET LAUNCHER_PATH=bin\Debug\netcoreapp3.1\Volo.AbpIo.Account.Web.exe
cd /d "C:\Program Files\IIS Express\"
iisexpress.exe /config:"D:\Github\volo\abp\abp_io\.vs\Volo.AbpIo\config\applicationhost.config" /site:"Volo.AbpIo.Account.Web" /apppool:"Volo.AbpIo.Account.Web AppPool"
```

You need to **modify the bold** values in the batch file. Descriptions:

- **LAUNCHER_PATH**: This is the exe file of your Web project. Be careful not to provide the full exe path; set it as a relative path. My CSPROJ is located at "*D:\Github\volo\abp\abp_io\src\Volo.AbpIo.Account.Web\Volo.AbpIo.Account.Web.csproj*" and I set *LAUNCHER_PATH* to "*bin\Debug\netcoreapp3.1\Volo.AbpIo.Account.Web.exe*".
- **/config**: This is the *applicationhost.config* file path. My solution path is "*D:\Github\volo\abp\abp_io\Volo.AbpIo.sln*" and *applicationhost.config* is located at "*D:\Github\volo\abp\abp_io\.vs\Volo.AbpIo\config\applicationhost.config*".
- **/site:** You can find the site name in the **applicationhost.config** file. It's in the `<site>` tag.
- **/apppool:** You can find the application pool name in the **applicationhost.config** file. It's in the `applicationPool` attribute of your `<application>` tag.
**applicationhost.config File Content:**

```xml
<sites>
  ....
  <site name="Volo.AbpIo.Account.Web" id="2">
    <application path="/" applicationPool="Volo.AbpIo.Account.Web AppPool">
      <virtualDirectory path="/" physicalPath="D:\Github\volo\abp\abp_io\src\Volo.AbpIo.Account.Web" />
    </application>
    <bindings>
      <binding protocol="http" bindingInformation="*:56570:localhost" />
      <binding protocol="https" bindingInformation="*:44333:localhost" />
    </bindings>
  </site>
  ....
```

<br>

And this is the GitHub Gist:

<br>

<script src="https://gist.github.com/ebicoglu/fa959ca3e6f8ff0efd0de71df27d8512.js"></script>

When you create a batch file as "**run-host.bat**" and run it, it'll run as seen below:

<center>Command line output window</center>

<br>

You'll also see the system tray application of IIS Express. Right-click it and click "Show All Applications" to see your running Web Apps.

<center>Windows System Tray Icon</center>

<br>

If you encounter any problem while running the application, check your Windows Event Viewer (*eventvwr.exe*) for the detailed logs!

<br>

Happy coding!
Over the past year, [Volosoft](https://volosoft.com/) has undergone many changes! After months of preparation and some hard work, we moved to our new office towards the end of 2019.

We have moved a bit far away and our physical address has changed. You can find us at [Istanbul Ataturk Airport Free Zone](http://www.isbi.com.tr/Default.aspx), which offers big incentives to software exporting companies.

Volosoft continues to grow, and as a natural step of that growth, we needed more space. Our new office is much bigger than the previous one and gives us a much better workspace. Besides, it is more spacious, modern, and comfortable as well!

One of the biggest decision-making factors to move was our growing team. In 2019, we went from 5 employees to 15, meaning we needed more space for desks, meeting areas, and common areas. Since we moved, we've added a few more to our team and we're constantly growing.

We are all settled in and adding some small fixtures & decorations day by day. There are still some small things to do to make this a great place: more greenery, bookshelves, etc. :) But we're incredibly excited to be in the new space even now!

------

## So what's new? Let's get into it.

- First, *the main office space*: We love open-concept offices.
- Next, *our shiny meeting room*: Everyone in the team is excited about this space as we hold our daily/weekly team meetings here. Feel free to play games on the PS4 when there is no meeting! :)
- See *our common area & sofa*, open to employees and guests (it's our favorite place to have sweet conversations over tea/coffee).
- And, welcome to *Volo Cafe*, where we have our lunch, grab various snacks, and hold our celebrations. By the way, we are fun-loving with our game console!

But more important than our office is our people, who make it lovable and a joy to work at. We've got a great team of creative thinkers: motivated, productive, and lifelong learners.
Come [meet the team](https://volosoft.com/MeetTheTeam)!

*<center> We are a team with a singular shared goal: to make the developer's life easier. </center>*

So, as we mentioned above, our office is quite a bit bigger now, so we still have space to welcome new peers to our team! [Click here](https://github.com/volosoft/vs-home/labels/job) to view all open positions. The desk for new peers is always ready before they come to the office! :)

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Welcome <a href="https://twitter.com/muerdem?ref_src=twsrc%5Etfw">@muerdem</a> to Volosoft! Your desk is ready, we are excited to start working with you :) <a href="https://t.co/yfHe816Oid">pic.twitter.com/yfHe816Oid</a></p>— Volosoft (@volosoftcompany) <a href="https://twitter.com/volosoftcompany/status/1221654230949298176?ref_src=twsrc%5Etfw">January 27, 2020</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

------

We welcomed "Cay Kahve Insan", a beloved YouTube channel in Turkey, to our office. [Check out our office tour video on YouTube](https://www.youtube.com/watch?v=FUjl4WPBOuM&feature=youtu.be) for a sneak peek of where we are while developing & coding for you and supporting you. (Watch it with English subtitles.)

<center> <iframe width="560" height="315" src="https://www.youtube.com/embed/FUjl4WPBOuM" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe> </center>

------

We consider our new office the beginning of another chapter in our history. Thanks for being with us throughout this growth! If you are in the area, feel free to stop by and say hello!
Last week, we were at [NDC {London}](https://ndc-london.com/) and had our booth at the 3-day conference to talk about the [ABP.IO](https://abp.io/) platform. We were proud to be a partner of the conference and to be on the same page with other global top leaders, each an expert in their field.

<center> Partners of NDC {London} 2020 </center>

[NDC London](https://ndc-london.com/), one of Europe's well-organized developer conferences, took place last week in the heart of London, Westminster. 800+ developers from all over the world came together to learn from the incredible [speakers](https://ndc-london.com/speakers/). There were 107 international speakers leading talks, panels, and workshops. The conference was not only made up of these; there were also evening events for networking & socializing, like a boat cruise, after-parties, comedy shows + never-ending food buffets! 🍱

------

We presented the ABP.IO platform as a combination of [**ABP Framework**](https://abp.io), a community-driven open-source web application framework, and [**ABP Commercial**](https://commercial.abp.io), our enterprise-ready web development platform built on top of the open-source ABP Framework.

> ABP.IO, a platform with numerous cool built-in features based on ASP.NET Core, lets you focus on what makes your web application unique rather than spending weeks on common infrastructure details.

<center> Volosoft Booth </center>

Our core ABP Framework development team presented the ABP.IO platform modules and features and talked about software development pains & gains in general. Then they showed many [demos](https://commercial.abp.io/demo) of ABP Commercial to the attendees who were wondering how exactly it looks. We were pleased with the traffic of coding enthusiasts!

On the second day, we welcomed [Steve Sanderson](https://twitter.com/stevensanderson?lang=en) and [Ryan Nowak](https://twitter.com/aVerySpicyBoi) to our booth.
It was great to talk to them again, about ABP.IO this time.

<blockquote class="twitter-tweet"><p lang="en" dir="ltr"><a href="https://twitter.com/stevensanderson?ref_src=twsrc%5Etfw">@stevensanderson</a> & <a href="https://twitter.com/aVerySpicyBoi?ref_src=twsrc%5Etfw">@aVerySpicyBoi</a> Thanks for coming by and listening <a href="https://twitter.com/abpframework?ref_src=twsrc%5Etfw">@abpframework</a> 🥳<br>It was pleasure to see you again! <a href="https://twitter.com/hashtag/NDCLondon?src=hash&ref_src=twsrc%5Etfw">#NDCLondon</a> (<a href="https://twitter.com/NDC_Conferences?ref_src=twsrc%5Etfw">@NDC_Conferences</a>) <a href="https://t.co/sI3veI5cvz">pic.twitter.com/sI3veI5cvz</a></p>— ABP (@abpframework) <a href="https://twitter.com/abpframework/status/1222865104997441538?ref_src=twsrc%5Etfw">January 30, 2020</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

During the conference, we gave the attendees diverse and pleasant swag: a fancy Bluetooth speaker, stickers, a phone hand holder, and tasty chocolate. We got great feedback from folks on our pretty swag!

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">DAY 2 has started! 🌥<br>Visit our booth and grab your swags: A fancy bluetooth speaker, stickers, phone hand holder, and tasty chocolate! 🤩<a href="https://twitter.com/hashtag/NDCLondon?src=hash&ref_src=twsrc%5Etfw">#NDCLondon</a> <a href="https://twitter.com/NDC_Conferences?ref_src=twsrc%5Etfw">@NDC_Conferences</a> <a href="https://t.co/Vpo4piBqKu">pic.twitter.com/Vpo4piBqKu</a></p>— ABP (@abpframework) <a href="https://twitter.com/abpframework/status/1222816601898209280?ref_src=twsrc%5Etfw">January 30, 2020</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

On the last day of the conference, we held the drawing of our PlayStation 4 raffle with great excitement! The winner left as the happiest attendee of the conference. Look how happy he is!
:)

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">It was a fantastic raffle drawing!Thanks all for joining.<br>👑 And the winner is <a href="https://twitter.com/iSQRT?ref_src=twsrc%5Etfw">@iSQRT</a>, congratulations! Happy playing.👏🏼<a href="https://twitter.com/hashtag/NDCLondon?src=hash&ref_src=twsrc%5Etfw">#NDCLondon</a> <a href="https://twitter.com/NDC_Conferences?ref_src=twsrc%5Etfw">@NDC_Conferences</a> <a href="https://t.co/UWl6uVLrUX">pic.twitter.com/UWl6uVLrUX</a></p>— ABP (@abpframework) <a href="https://twitter.com/abpframework/status/1223284506720972802?ref_src=twsrc%5Etfw">January 31, 2020</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

We had an amazing time and so much fun meeting all of you! Thanks to all the enthusiastic developers and organizers who made it inspiring. We would be happy to stay in touch. Hope to see you next year!

More impressions from the conference:
In this article, I will show you how to integrate the refresh token mechanism into an ASP.NET Zero project. We use Angular's `HttpInterceptor` to handle requests, and I will show how to use refresh tokens from within the interceptor.

In summary, an HttpInterceptor works as middleware between each request and the server. By default, all your requests enter the HttpInterceptor's intercept method, where you can handle the request and release it to the next handler. Our interceptor will work as shown in the diagram below.

Before each request:

- Handle the request (add the auth header, etc.), call the server with that request, and subscribe to the result.

After getting a response:

- Check if it is an HTTP 401 (unauthorized) result.
- If it is not, pass it to the next handler.
- If it is HTTP 401:
  - Check whether there is an ongoing re-authentication with the refresh token.
    - If there is, store the request and wait for the auth result.
    - Otherwise, try to authenticate with the refresh token (if a refresh token exists).
  - If you can authenticate with the refresh token, store the new tokens.
  - Recall the previous requests which got the HTTP 401 error; if there are any stored requests, call them with the new auth token.

**Implementation**

We use the `abp-ng2-module` package in Angular projects. It has basic implementations that we may need while developing our Angular projects. We use Angular's `HttpInterceptor` to handle requests (adding our headers, handling errors, etc.) and it's located in the `abp-ng2-module` package. Check it out on GitHub: [https://github.com/aspnetboilerplate/abp-ng2-module/blob/master/projects/abp-ng2-module/src/lib/interceptors/abpHttpInterceptor.ts](https://github.com/aspnetboilerplate/abp-ng2-module/blob/master/projects/abp-ng2-module/src/lib/interceptors/abpHttpInterceptor.ts)

We use that interceptor in the ASP.NET Zero project.
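Framework specifics aside, the 401 flow described above can be sketched with plain promises. This is a hypothetical simplification, not the actual `abp-ng2-module` code, and all names here are illustrative:

```ts
type ApiRequest = { url: string; token: string };
type ApiResponse = { status: number; body?: string };

class RefreshTokenService {
  private refreshing: Promise<string> | null = null;

  constructor(private doRefresh: () => Promise<string>) {}

  // Single-flight: concurrent 401s await the same in-progress refresh call.
  getNewToken(): Promise<string> {
    if (!this.refreshing) {
      this.refreshing = this.doRefresh().then(
        token => { this.refreshing = null; return token; },
        err => { this.refreshing = null; throw err; },
      );
    }
    return this.refreshing;
  }
}

async function send(
  req: ApiRequest,
  server: (r: ApiRequest) => Promise<ApiResponse>,
  refresh: RefreshTokenService,
): Promise<ApiResponse> {
  const res = await server(req);
  if (res.status !== 401) return res;        // not unauthorized: pass through
  const token = await refresh.getNewToken(); // re-authenticate (once)
  return server({ ...req, token });          // replay the failed request
}
```

With an expired token, `send` hits the server twice: once to receive the 401 and once to replay the request with the refreshed token, which is exactly the behavior the interceptor adds transparently.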
**Let’s start coding.** Since we have two seperate projects and our interceptor don’t know about a client I create abstract service which responsible for authorization with refresh token (if it exists). <script src="https://gist.github.com/demirmusa/043108023ce21c7d855f4176754b1423.js"></script> Our interceptor was as seen below <script src="https://gist.github.com/demirmusa/7f1a4a54b77572f8d3978bfd3116b192.js"></script> What we need to do is; catch errors on requests and handle it, so we can change the intercept method as seen below <script src="https://gist.github.com/demirmusa/27a5c25765f1b6f19c1ed588ea7a3e2a.js"></script> <script src="https://gist.github.com/demirmusa/a6efa99354d2348957cc14a346b2067b.js"></script> [aspnetboilerplate/abp-ng2-module/abpHttpInterceptor.tsTake a look at all the codes in interceptorgithub.com](https://github.com/aspnetboilerplate/abp-ng2-module/blob/master/projects/abp-ng2-module/src/lib/interceptors/abpHttpInterceptor.ts) The `abp-ng2-module` package is now ready to use with the refresh token. Time to update ASP.NET Zero. I updated `abp-ng2-module` packages on zero and created a service called ‘ZeroRefreshTokenService’ at the relevant location. <script src="https://gist.github.com/demirmusa/d871caf0587c054b8e23ab13a87fc712.js"></script> Finally, I have added my token service to the providers of my module. ```ts @NgModule({ ... providers:[ ... { provide: RefreshTokenService, useClass: ZeroRefreshTokenService} ] }) ``` As you see below, `GetEditionComboboxItem `, `GetTenants `actions are called and we get HTTP401 unauthorized results, then our interceptor calls `RefreshToken`action and successfully gets the new token. Then it recalls all the unauthorizated requests with the new token. 
Last week we were in the Netherlands to participate in the Techorama 2019 conference. Techorama is an international software development conference which takes place in Ede. It was quite a big event, with over 30 partners and over 1000 attendees. The venue was a cinema, and the whole venue was closed for this event. Volosoft is proud to have been a Platinum Partner of Techorama.

It started on Tuesday, October 1 and continued the next day: 2 full days from 07:30 to 18:00 with 10 parallel sessions. There were many different topics, like AI & IoT & ML, cloud services, data platform and BI, developer trends, DevOps and architecture, motivational and self-improvement, back-end and front-end technologies...

There were 90 speakers from different parts of the world, like Mads Torgersen, Scott Allen, Dan Wahlin, Jimmy Bogard, Pinal Dave, and Richard Campbell, and our lead architect **Halil Ibrahim Kalkan** also had a session about "Implementing Domain Driven Design". The room was full and some visitors were even standing along the walls to listen to Halil's talk. It was really great to see such a strong reaction.

We were the only company there with an open-source product. [ABP.IO](https://abp.io/) is our new approach to a microservice-compatible & multi-tenancy-enabled solution. It has numerous cool features for starting a new project, compared to a from-scratch ASP.NET Core project. We had a lot of people visiting our booth; people were all interested in this open-source ASP.NET Core 3.0 compatible web application framework!

We brought lots of cool swag to the Netherlands: nice colorful Bluetooth speakers and hand-made chocolates. As always, we had some great fun during the drawing of our PlayStation 4 raffle. We also gave away an ASP.NET Zero Ultimate License, which costs $4,499.
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Congratulations to <a href="https://twitter.com/mikkeljohansen?ref_src=twsrc%5Etfw">@mikkeljohansen</a> for winning the Playstation4 prize by Volosoft at <a href="https://twitter.com/hashtag/techorama?src=hash&ref_src=twsrc%5Etfw">#techorama</a> Netherlands! <a href="https://t.co/lO3wnC4dkH">pic.twitter.com/lO3wnC4dkH</a></p>— Volosoft (@volosoftcompany) <a href="https://twitter.com/volosoftcompany/status/1179401123410698241?ref_src=twsrc%5Etfw">October 2, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

Check out Techorama Netherlands 2019 in pictures:

See you next year!
In this article, I will integrate ASP.NET Core health checks into an ASP.NET Boilerplate project.

## What is ASP.NET Health Check

ASP.NET Core offers Health Checks Middleware and libraries for reporting the health of app infrastructure components. It allows you to check the health of the application. There are dozens of libraries you can use with health checks, and you can also create your own health checks.

Let's start.

- Download your ASP.NET Boilerplate application. Go to [https://aspnetboilerplate.com/Templates](https://aspnetboilerplate.com/Templates), download your .NET Core application and complete the first setup (update DB, etc.). (In this article I will use the Multi-Page Web Application.)

> See [https://aspnetboilerplate.com/Pages/Documents](https://aspnetboilerplate.com/Pages/Documents) for more information.

- Add the `Microsoft.AspNetCore.Diagnostics.HealthChecks` NuGet package to your `*.Web.Mvc` project.
- Open the `*.Web.Mvc` project's `Startup.cs` file and add the health checks middleware as seen below.

```csharp
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        ...
        services.AddHealthChecks();
        ...
    }

    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        ...
        app.UseHealthChecks("/health"); // your request URL will be /health
        ...
    }
}
```

Now it returns a "Healthy" string to show us the system is up and healthy.

## Add Health Check UI

There are dozens of libraries that you can use with health checks. Let's use AspNetCore.HealthChecks.UI.

AspNetCore.HealthChecks.UI is a library that gives you a nice-looking user interface. Every health check that you add will be automatically added to the user interface.

- Add the `AspNetCore.HealthChecks.UI` NuGet package to your `*.Web.Mvc` project.
- Open the `*.Web.Mvc` project's `Startup.cs` file and change it as shown below.

```csharp
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        ...
        services.AddHealthChecks();
        services.AddHealthChecksUI();
        ...
    }

    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        ...
        app.UseHealthChecks("/health", new HealthCheckOptions()
        {
            Predicate = _ => true,
            ResponseWriter = UIResponseWriter.WriteHealthCheckUIResponse
        });

        app.UseHealthChecksUI();
        ...
    }
}
```

- Add your UI settings to the `*.Web.Mvc` project's `appsettings.json` file.

```json
"HealthChecksUI": {
  "HealthChecks": [
    {
      "Name": "HealthCheckExample",
      "Uri": "http://localhost:62114/health"
    }
  ],
  "EvaluationTimeOnSeconds": 10,
  "MinimumSecondsBetweenFailureNotifications": 60
}
```

After that, the response from the URL [http://localhost:62114/health](http://localhost:62114/health) will be like below.

And you will have the health checks UI at [http://localhost:62114/healthchecks-ui](http://localhost:62114/healthchecks-ui).

## Create Your Own Health Checks

To check anything you want, you can use existing libraries or create your own health checker classes implementing `IHealthCheck`. To use an existing library, see its documentation. I will create a custom health checker.

- Add the `Microsoft.Extensions.Diagnostics.HealthChecks.Abstractions` NuGet package to your `*.Application` project.
- Create a folder named `HealthChecks` in the `*.Application` project and create your custom health check service class in it. (I will create a DB context checker named `HealthCheckExampleDbContextCheck`.)

<script src="https://gist.github.com/demirmusa/886f68ddb2b8ebe9b0520a309792a244.js"></script>

- Instead of adding all of your health checks in `Startup.cs`, we will create a health check builder in `*.Web.Core`. Create a `HealthChecks` folder in the `*.Web.Core` project and create a class named `HealthCheckExampleHealthCheckBuilder` as shown below.
- Use your new builder in `Startup.cs` <script src="https://gist.github.com/demirmusa/b93c71c78ca04bba0cda38f41d1f2748.js"></script> <script src="https://gist.github.com/demirmusa/b93c71c78ca04bba0cda38f41d1f2748.js"></script> Finally you can see the health check status as seen below:  You can check the health status of the following list or any other 3rd parties. - Sql Server, MySql, Oracle … - EventStore - RabbitMQ - Elasticsearch - Redis - … See [https://github.com/Xabaril/AspNetCore.Diagnostics.HealthChecks](https://github.com/Xabaril/AspNetCore.Diagnostics.HealthChecks) for more. You can add more and more to your health check builder and check all dependencies of your application. Check your DB, micro-services, Redis, Azure Storage, Identity Server, Uris, local storage, etc… whatever you need. See also [https://docs.docker.com/engine/reference/builder/#healthcheck](https://docs.docker.com/engine/reference/builder/#healthcheck) **Example project GitHub URL:** [https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/HealthCheck/aspnet-core](https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/HealthCheck/aspnet-core)
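In case the embedded gists do not render, here is a minimal sketch of what a custom checker implementing `IHealthCheck` can look like. The class name, the injected `HealthCheckExampleDbContext`, and the `Database.CanConnectAsync` probe are illustrative assumptions, not the exact content of the gists above:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

// Hypothetical DB context health check (names are illustrative).
public class ExampleDbContextHealthCheck : IHealthCheck
{
    private readonly HealthCheckExampleDbContext _dbContext;

    public ExampleDbContextHealthCheck(HealthCheckExampleDbContext dbContext)
    {
        _dbContext = dbContext;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken = default)
    {
        try
        {
            // Any cheap connectivity probe proves the database is reachable.
            if (await _dbContext.Database.CanConnectAsync(cancellationToken))
            {
                return HealthCheckResult.Healthy("DbContext can connect to the database.");
            }

            return HealthCheckResult.Unhealthy("DbContext could not connect to the database.");
        }
        catch (Exception ex)
        {
            // Reporting the exception makes the failure visible in the UI payload.
            return HealthCheckResult.Unhealthy("DbContext check threw an exception.", ex);
        }
    }
}
```

Such a check is registered with `services.AddHealthChecks().AddCheck<ExampleDbContextHealthCheck>("Database");`, after which it shows up automatically in the health checks UI.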
In this article, I will try to give brief information about Azure Cosmos DB and introduce its common concepts. Azure Cosmos DB is getting very popular, and Entity Framework Core is going to support it in its 3.0 version; see [https://docs.microsoft.com/en-us/ef/core/providers/#cosmos-db](https://docs.microsoft.com/en-us/ef/core/providers/#cosmos-db). So, what is Azure Cosmos DB? It is a planet-scale document database which is an evolution of Azure Document DB. When you log in to the Azure Portal, you can easily create an Azure Cosmos DB database and collection(s) in that database. You can think of a collection as a table in a relational database, but without a schema.

# Pricing

Before we start, let's talk about pricing. Your usage is priced based on Request Units (RUs). You can think of RUs per second as the currency for throughput.

- A write costs more than a read
- A query on indexed data costs less
- The size of the data affects RUs and price as well
- The latency option affects pricing (we will talk about it later)
- The indexing policy also affects pricing

It is better to calculate your needs in advance so that you will not see any surprising bill on your Azure account. You can use Microsoft's capacity planner for Document DB (since renamed to Cosmos DB): [https://www.documentdb.com/capacityplanner](https://www.documentdb.com/capacityplanner). Another good thing is that each request to Azure Cosmos DB returns the consumed RUs, so you can decide whether to stop your requests for a while or increase the RU limit of your collection in the Azure portal.

# Multi API support

Azure Cosmos DB supports 5 types of APIs:

- SQL API (JSON)
- MongoDB API (BSON)
- Gremlin API (Graph)
- Table API (Key-Value)
- Cassandra API (Columnar)

Although Azure Cosmos DB offers 5 different API models, all the different data models are stored as ARS (Atom Record Sequence). It also supports Stored Procedures, User Defined Functions and Triggers.
Isn't that great 😉

# Horizontal partitioning

One of the powerful sides of Azure Cosmos DB is that it can partition your data automatically. For Cosmos DB to do that, you need to define a partition key for your collection while you are creating it. Let's say that you have a database named **Travel** with a collection named **Hotels**. Here is a sample hotel document for the Hotels collection:

```json
{
  "id": "b318aeb0-4b0c-4ef0-8d4b-ddd10f502033",
  "name": "Villa Borghese",
  "country": "Italy",
  "city": "Rome"
}
```

If you select **city** as the partition key for the Hotels collection, Cosmos DB will automatically partition your collection as your data grows. It is crucial to select the correct partition key here; Cosmos DB will handle the rest.

In some cases, some of your partitions might receive many more requests (writes) than other partitions. Such partitions are called hot partitions. In that case, it is better to create another collection only for this specific partition and use another partition key. For example, assume that the city of Rome receives a lot of hotel creation requests. In that case, you can create another collection (Hotels_Rome) and use **district** as the partition key for this new collection. For a multi-tenant app, TenantId can be used as a partition key for most of the collections.

## Cross Partition Queries

Such a great benefit comes with a disadvantage, of course. It is suggested not to run cross-partition queries, because they are slower and cost more. For a cross-partition query, Cosmos DB runs your query on more than one partition, merges the results and returns them to you. If you still need to run cross-partition queries, you need to explicitly allow it in your FeedOptions:

```csharp
var option = new FeedOptions { EnableCrossPartitionQuery = true };
```

# Replication (Globally distributing data)

Azure Cosmos DB automatically replicates your data over the available data centers. But why should we replicate the data?
The answer is simple: performance. If your app is closer to the data source, it will retrieve the data faster and your users will have a better experience with your app. Here is a screenshot of Azure Cosmos DB's replication screen:

Here you can easily select where to replicate your data. There are two region types in Azure Cosmos DB: write regions and read regions. As you can easily guess, data can be written to write regions, and data can be read from both write and read regions. You can also define fail-over regions, so Azure can fall back to the next region when a query execution fails.

## Consistency

Replication comes with a choice of consistency. When one instance of your app writes data to a write region, Azure needs to replicate this data to the other regions. Azure Cosmos DB offers 5 consistency levels, which means you need to select how Azure should replicate your data between your Azure Cosmos DB regions. Let's see what those consistency levels are:

**Strong**

In this model, there are no dirty reads. When a data item is updated, everybody keeps reading the old value until the data is replicated to all regions. This is the slowest option.

**Bounded Staleness**

In this option, you can define a period of time or an update count for the staleness of your data. You can say: no dirty reads for 1 minute, or no dirty reads for data updated more than 5 times. When you set the time option to 0, it is exactly the same as the Strong consistency option.

**Session**

In this option, no dirty reads are possible for writers, but dirty reads are possible for readers. This is the default option. So, if you are the one writing the data, you can read that data back; others may read stale data for a while.

**Consistent Prefix**

In this option, dirty reads are possible but they always arrive in order. So, if a data item is updated with the values 1, 2, 3 in that order, readers always see the updates in this order. No one will see the value 3 before 2.
**Eventual**

In this option, dirty reads are possible and there is no guarantee of order. So, if a data item is updated with the values 1, 2, 3 in that order, a reader can see value 3 before seeing value 2. But this is the fastest option.

Here is a commonly used image showing the consistency options of Azure Cosmos DB:

# Resource Model

Azure Cosmos DB adds additional fields to your documents. Here you can see a sample document model:

If you don't set a value for the **id** field, Azure Cosmos DB will automatically assign a GUID value.

# Migrations

For migrating your existing database to Azure Cosmos DB, there is an open-source tool which you can find at [https://azure.microsoft.com/en-us/updates/documentdb-data-migration-tool/](https://azure.microsoft.com/en-us/updates/documentdb-data-migration-tool/). This tool doesn't offer much, but it understands column names in your tables or views on SQL Server and converts them to hierarchical data in Azure Cosmos DB. Assume that we have a view like below:

```sql
SELECT
  Name,
  Country AS "Address.Country",
  City AS "Address.City"
FROM Hotels
```

The migrator tool can convert the result of this query (your view) into nested documents.

# Azure Cosmos DB Emulator

For me, one of the most impressive things about Azure Cosmos DB is that it has a local emulator. You can install it on your computer and emulate Azure Cosmos DB locally. You can download the emulator at [https://aka.ms/cosmosdb-emulator](https://aka.ms/cosmosdb-emulator). By using this emulator, you can even calculate the cost of your queries and make assumptions about the cost of your Azure Cosmos DB usage.

# Nuget Packages

Currently, Cosmos DB support for Entity Framework Core is not yet released. Until it is released, you can use the packages below to connect to and use Azure Cosmos DB from your C# application.
For the full .NET Framework: [https://www.nuget.org/packages/Microsoft.Azure.DocumentDB/](https://www.nuget.org/packages/Microsoft.Azure.DocumentDB/)

For .NET Core: [https://www.nuget.org/packages/Microsoft.Azure.DocumentDB.Core/](https://www.nuget.org/packages/Microsoft.Azure.DocumentDB.Core/)
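To tie the concepts together, here is a minimal sketch using the DocumentDB SDK named above. The endpoint, key, and `Hotel` class are placeholders (the local emulator endpoint is assumed); the partition-key and consistency settings correspond to the concepts discussed earlier:

```csharp
using System;
using System.Linq;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

// Placeholder entity matching the sample document shown earlier.
public class Hotel
{
    public string id { get; set; }
    public string name { get; set; }
    public string country { get; set; }
    public string city { get; set; }
}

public static class CosmosDemo
{
    public static void Run()
    {
        // Session is the default consistency level; per client you can
        // request a weaker (never stronger) level than the account setting.
        var client = new DocumentClient(
            new Uri("https://localhost:8081"), // assumed emulator endpoint
            "<your-key>",
            new ConnectionPolicy(),
            ConsistencyLevel.Session);

        var collectionUri = UriFactory.CreateDocumentCollectionUri("Travel", "Hotels");

        // Single-partition query: cheapest, because the partition key ("city") is given.
        var hotelsInRome = client.CreateDocumentQuery<Hotel>(
                collectionUri,
                new FeedOptions { PartitionKey = new PartitionKey("Rome") })
            .ToList();

        // Cross-partition query: must be enabled explicitly and costs more RUs.
        var allItalianHotels = client.CreateDocumentQuery<Hotel>(
                collectionUri,
                new FeedOptions { EnableCrossPartitionQuery = true })
            .Where(h => h.country == "Italy")
            .ToList();
    }
}
```

Note how `EnableCrossPartitionQuery` is only needed for the second query, where no partition key is supplied.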
In this tutorial, we will implement PDF exporter functionality. We will use the open source [**DinkToPdf**](https://github.com/rdvojmoc/DinkToPdf) library to convert HTML to PDF.

## Using DinkToPdf Library

First, we install the DinkToPdf package into the **\*.Application** project.

Then, we need to download the library from the [**GitHub repository**](https://github.com/rdvojmoc/DinkToPdf) and copy the **v0.12.4** folder into the "/aspnet_core/src/{ProjectName}.Web.Mvc/wkhtmltox" folder that we have created.

Finally, we need to add these lines to {ProjectName}.Web.Mvc.csproj:

```xml
<ItemGroup>
  <None Update="log4net.config">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\32 bit\libwkhtmltox.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\32 bit\libwkhtmltox.dylib">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\32 bit\libwkhtmltox.so">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\64 bit\libwkhtmltox.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\64 bit\libwkhtmltox.dylib">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wkhtmltox\v0.12.4\64 bit\libwkhtmltox.so">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
  </None>
  <None Update="wwwroot\**\*">
    <CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```

## Loading Assemblies

We need to load the assemblies that we copied to the **wkhtmltox** folder when the application starts. First, we will create a new class named "CustomAssemblyLoadContext" in the Startup folder of the MVC project, like below:

```csharp
using System;
using System.Reflection;
using System.Runtime.Loader;

public class CustomAssemblyLoadContext : AssemblyLoadContext
{
    public IntPtr LoadUnmanagedLibrary(string absolutePath)
    {
        return LoadUnmanagedDll(absolutePath);
    }

    protected override IntPtr LoadUnmanagedDll(string unmanagedDllName)
    {
        return LoadUnmanagedDllFromPath(unmanagedDllName);
    }

    protected override Assembly Load(AssemblyName assemblyName)
    {
        throw new NotImplementedException();
    }
}
```

And then use it in the ConfigureServices method of Startup.cs:

```csharp
using DinkToPdf;
using DinkToPdf.Contracts;
//...

public class Startup
{
    private readonly IConfigurationRoot _appConfiguration;
    private readonly IHostingEnvironment _hostingEnvironment;

    public Startup(IHostingEnvironment env)
    {
        _appConfiguration = env.GetAppConfiguration();
        _hostingEnvironment = env;
    }

    public IServiceProvider ConfigureServices(IServiceCollection services)
    {
        //Other codes...

        var architectureFolder = (IntPtr.Size == 8) ? "64 bit" : "32 bit";
        var wkHtmlToPdfPath = Path.Combine(_hostingEnvironment.ContentRootPath,
            $"wkhtmltox\\v0.12.4\\{architectureFolder}\\libwkhtmltox");

        var context = new CustomAssemblyLoadContext();
        context.LoadUnmanagedLibrary(wkHtmlToPdfPath);

        services.AddSingleton(typeof(IConverter), new SynchronizedConverter(new PdfTools()));

        //Other codes...
    }

    //Other codes...
}
```

Now we are ready to implement the functionality.

## Generating a PDF file from HTML

We will convert the user list to PDF in this tutorial. So, respectively, we need to get the user list, place it in an HTML table and convert that to PDF.
We will create a **UserListPdfExporter** class for this:

```csharp
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Abp.Dependency;
using Abp.Domain.Repositories;
using DinkToPdf;
using DinkToPdf.Contracts;
using ModuleZeroPdfCreationDemo.Authorization.Users;

namespace ModuleZeroPdfCreationDemo.Users.exporting
{
    public class UserListPdfExporter : ITransientDependency
    {
        private readonly IRepository<User, long> _userRepository;
        private readonly IConverter _converter;

        public UserListPdfExporter(IRepository<User, long> userRepository, IConverter converter)
        {
            _userRepository = userRepository;
            _converter = converter;
        }

        public async Task<FileDto> GetUsersAsPdfAsync()
        {
            var users = await _userRepository.GetAllListAsync();
            var html = ConvertUserListToHtmlTable(users);

            var doc = new HtmlToPdfDocument()
            {
                GlobalSettings =
                {
                    PaperSize = PaperKind.A4,
                    Orientation = Orientation.Portrait
                },
                Objects =
                {
                    new ObjectSettings()
                    {
                        HtmlContent = html
                    }
                }
            };

            return new FileDto("UserList.pdf", _converter.Convert(doc));
        }

        private string ConvertUserListToHtmlTable(List<User> users)
        {
            var header1 = "<th>Username</th>";
            var header2 = "<th>Name</th>";
            var header3 = "<th>Surname</th>";
            var header4 = "<th>Email Address</th>";
            var headers = $"<tr>{header1}{header2}{header3}{header4}</tr>";

            var rows = new StringBuilder();
            foreach (var user in users)
            {
                var column1 = $"<td>{user.UserName}</td>";
                var column2 = $"<td>{user.Name}</td>";
                var column3 = $"<td>{user.Surname}</td>";
                var column4 = $"<td>{user.EmailAddress}</td>";
                var row = $"<tr>{column1}{column2}{column3}{column4}</tr>";
                rows.Append(row);
            }

            return $"<table>{headers}{rows}</table>";
        }
    }

    public class FileDto
    {
        public string FileName { get; set; }
        public byte[] FileBytes { get; set; }

        public FileDto(string fileName, byte[] fileBytes)
        {
            FileName = fileName;
            FileBytes = fileBytes;
        }
    }
}
```

## Download User List as PDF

To download the PDF that we created using **UserListPdfExporter**, we will add a new method to **UsersController**:

```csharp
using ModuleZeroPdfCreationDemo.Users.exporting;
//...

namespace ModuleZeroPdfCreationDemo.Web.Controllers
{
    public class UsersController : ModuleZeroPdfCreationDemoControllerBase
    {
        //...
        private readonly UserListPdfExporter _userListPdfExporter;

        public UsersController(/*Other codes....*/ UserListPdfExporter userListPdfExporter)
        {
            //...
            _userListPdfExporter = userListPdfExporter;
        }

        //Other codes....

        public async Task<ActionResult> DownloadAsPdfAsync()
        {
            var file = await _userListPdfExporter.GetUsersAsPdfAsync();
            return File(file.FileBytes, "application/pdf", file.FileName);
        }
    }
}
```

And then we can add a new element to the UI for the PDF export button.

In Index.cshtml:

```html
<button type="button" class="btn btn-primary btn-circle waves-effect waves-circle waves-float pull-right" id="ExportToPdfButton">
    <i class="material-icons">cloud_download</i>
</button>
```

In Index.js:

```javascript
$('#ExportToPdfButton').click(function () {
    location.href = abp.appPath + 'Users/DownloadAsPdfAsync';
});
```

And we are ready to use the new functionality.
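The table produced by the exporter is unstyled. Since wkhtmltopdf renders ordinary HTML and CSS, one option (a sketch; the helper name and style values are invented here, not part of the tutorial code) is to wrap the table in a full HTML document with an embedded stylesheet before handing it to the converter:

```csharp
// Hypothetical helper: wraps the generated table markup in a complete
// HTML document with embedded CSS so the PDF gets borders and padding.
private string WrapWithStyles(string tableHtml)
{
    return
        "<html><head><meta charset=\"utf-8\"><style>" +
        "table { width: 100%; border-collapse: collapse; font-family: sans-serif; }" +
        "th, td { border: 1px solid #999; padding: 4px 8px; text-align: left; }" +
        "th { background-color: #eee; }" +
        "</style></head><body>" + tableHtml + "</body></html>";
}
```

You would then pass `WrapWithStyles(html)` as the `HtmlContent` of the `ObjectSettings` instead of the raw table string.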
# Building and consuming GraphQL in your .NET application

GraphQL is a single API surface for different clients like mobile apps, desktop apps, tablets… It was built by Facebook.

> It's not a magical library that solves all our Web API issues, but it helps us provide our clients a very flexible Web API.

I wish we could just hand GraphQL the EF DbContext and it would solve everything. But things are not so easy! In this document we will see what GraphQL is, what it solves, how to build a simple GraphQL back-end for an ASP.NET Core project, and also how to consume it. Before starting, notice that you can find the demonstrated project at [https://github.com/ebicoglu/AspNetCoreGraphQL-MyHotel](https://github.com/ebicoglu/AspNetCoreGraphQL-MyHotel)

<center> GraphQL & Client transmission </center>

<center> GraphQL & Clients </center>

<center> Rest API & GraphQL pipelines </center>

Note that GraphQL doesn't need HTTP to work; HTTP is just a transport for sending data to the API. Since every query goes through a single endpoint with a varying query body, URL-based HTTP caching is not possible!

# GraphQL Language Basics

You can query popular APIs using GraphQL in your browser at [https://www.graphqlhub.com/playground](https://www.graphqlhub.com/playground), so I'll play with GitHub's GraphQL schema. The below query gets the user with the username *ebicoglu*, including the repositories of the user. We name this query "TestQuery".

```GraphQL
query TestQuery {
  graphQLHub
  github {
    user(username: "ebicoglu") {
      id
      login
      company
      avatar_url
      repos {
        name
      }
    }
  }
}
```

<center> A sample request & response </center>

# Variables

You can use variables in your queries, similar to SQL variables. To avoid string concatenation, we supply variables as a separate parameter.

```GraphQL
query TestQuery($currentUserName: String!) {
  graphQLHub
  github {
    user(username: $currentUserName) {
      id
      login
      company
      avatar_url
      repos {
        name
      }
    }
  }
}
```

# Directives

Directives provide a way to describe additional options to the GraphQL executor. Essentially, a directive allows GraphQL to change the result of our queries based on criteria we provide. It can be used for permission management. There are **@skip** and **@include** directives. The skip directive, when used on fields or fragments, allows us to exclude fields based on some condition. The include directive allows us to include fields based on some condition. You can see an include directive in use in the aliases sample below.

# Aliases

Sometimes the UI uses field names different from what comes from the host. You can rename a field so that it matches your UI fields and you don't need to transform the data. In the following query we rename the fields:

- login **> user_name**
- company **> company_name**

```GraphQL
query TestQuery($currentUserName: String!, $includeRepos: Boolean!) {
  graphQLHub
  github {
    user(username: $currentUserName) {
      id
      user_name: login
      company_name: company
      avatar_url
      repos @include(if: $includeRepos) {
        name
      }
    }
  }
}
```

<center> Using aliases to rename fields </center>

# Fragments

A fragment is a shared piece of query logic. It reduces code repetition.

> A fragment is a reusable template of fields.

In the following query, **UserInfo** is a fragment declared outside the query braces. Use a three-dot (...) prefix to use a fragment.

```GraphQL
query TestQuery {
  github {
    user1: user(username: "ebicoglu") {
      ...UserInfo
    }
    user2: user(username: "shanselman") {
      ...UserInfo
    }
  }
}

fragment UserInfo on GithubUser {
  id
  login
  company
}
```

# Mutations

So far we have focused only on data fetching, but mutations allow you to send data to the server. We will not cover this subject.
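For flavor, a mutation has the same shape as a query. The operation and field names below are hypothetical (they are not part of the GitHub schema used above):

```GraphQL
mutation CreateReservation($roomId: Int!, $guestId: Int!) {
  createReservation(roomId: $roomId, guestId: $guestId) {
    id
    checkinDate
  }
}
```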
For more information you can check out [https://graphql.org/learn/queries/#mutations](https://graphql.org/learn/queries/#mutations)

------

# Building GraphQL APIs with ASP.NET Core

## Preparing Project

1- Create an ASP.NET Core project. I prefer an **ASP.NET Core Angular** project to have UI capabilities. So create an empty folder, open CMD in the folder and run the command below to create your empty Angular project.

```bash
dotnet new angular
```

2- Prepare the project for basic CRUD operations. Our sample project is a hotel reservation application.

2.1 — Create the entities: Guest, Room and Reservation.

```csharp
public class Guest
{
    [Key]
    public int Id { get; set; }

    [Required]
    [StringLength(300)]
    public string Name { get; set; }

    public DateTime RegisterDate { get; set; }

    public Guest()
    {
    }

    public Guest(string name, DateTime registerDate)
    {
        Name = name;
        RegisterDate = registerDate;
    }
}

public enum RoomStatus
{
    Unavailable = 0,
    Available = 1,
}

public class Room
{
    [Key]
    public int Id { get; set; }

    [Required]
    public int Number { get; set; }

    [StringLength(200)]
    public string Name { get; set; }

    [Required]
    public RoomStatus Status { get; set; }

    public bool AllowedSmoking { get; set; }

    public Room()
    {
    }

    public Room(int number, string name, RoomStatus status, bool allowedSmoking)
    {
        Number = number;
        Name = name;
        Status = status;
        AllowedSmoking = allowedSmoking;
    }
}

public class Reservation
{
    [Key]
    public int Id { get; set; }

    [ForeignKey("RoomId")]
    public Room Room { get; set; }
    public int RoomId { get; set; }

    [ForeignKey("GuestId")]
    public Guest Guest { get; set; }
    public int GuestId { get; set; }

    [Required]
    public DateTime CheckinDate { get; set; }

    public DateTime CheckoutDate { get; set; }

    public Reservation()
    {
    }

    public Reservation(DateTime checkinDate, DateTime checkoutDate, int roomId, int guestId)
    {
        CheckinDate = checkinDate;
        CheckoutDate = checkoutDate;
        RoomId = roomId;
        GuestId = guestId;
    }
}
```

2.2 — Reference the [Microsoft.EntityFrameworkCore](https://www.nuget.org/packages/Microsoft.EntityFrameworkCore/) package from NuGet.

2.3 — Create your DbContext.

```csharp
public class MyHotelDbContext : DbContext
{
    public static string DbConnectionString = "Server=localhost; Database=MyHotelDb; Trusted_Connection=True;";

    public MyHotelDbContext(DbContextOptions<MyHotelDbContext> options) : base(options)
    {
    }

    public DbSet<Reservation> Reservations { get; set; }
    public DbSet<Guest> Guests { get; set; }
    public DbSet<Room> Rooms { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        //GUESTS
        modelBuilder.Entity<Guest>().HasData(new Guest("Alper Ebicoglu", DateTime.Now.AddDays(-10)) { Id = 1 });
        modelBuilder.Entity<Guest>().HasData(new Guest("George Michael", DateTime.Now.AddDays(-5)) { Id = 2 });
        modelBuilder.Entity<Guest>().HasData(new Guest("Daft Punk", DateTime.Now.AddDays(-1)) { Id = 3 });

        //ROOMS
        modelBuilder.Entity<Room>().HasData(new Room(101, "yellow-room", RoomStatus.Available, false) { Id = 1 });
        modelBuilder.Entity<Room>().HasData(new Room(102, "blue-room", RoomStatus.Available, false) { Id = 2 });
        modelBuilder.Entity<Room>().HasData(new Room(103, "white-room", RoomStatus.Unavailable, false) { Id = 3 });
        modelBuilder.Entity<Room>().HasData(new Room(104, "black-room", RoomStatus.Unavailable, false) { Id = 4 });

        //RESERVATIONS
        modelBuilder.Entity<Reservation>().HasData(new Reservation(DateTime.Now.AddDays(-2), DateTime.Now.AddDays(3), 3, 1) { Id = 1 });
        modelBuilder.Entity<Reservation>().HasData(new Reservation(DateTime.Now.AddDays(-1), DateTime.Now.AddDays(4), 4, 2) { Id = 2 });

        base.OnModelCreating(modelBuilder);
    }
}
```

2.4 — Create a repository for the Reservation entity.

```csharp
public class ReservationRepository
{
    private readonly MyHotelDbContext _myHotelDbContext;

    public ReservationRepository(MyHotelDbContext myHotelDbContext)
    {
        _myHotelDbContext = myHotelDbContext;
    }

    public async Task<List<T>> GetAll<T>()
    {
        return await _myHotelDbContext
            .Reservations
            .Include(x => x.Room)
            .Include(x => x.Guest)
            .ProjectTo<T>()
            .ToListAsync();
    }

    public async Task<IEnumerable<Reservation>> GetAll()
    {
        return await _myHotelDbContext
            .Reservations
            .Include(x => x.Room)
            .Include(x => x.Guest)
            .ToListAsync();
    }
}
```

2.5 — Add the DbContext and your repository to the ASP.NET Core Startup.cs.

```csharp
public void ConfigureServices(IServiceCollection services)
{
    //...
    services.AddDbContext<MyHotelDbContext>(options => options.UseSqlServer(MyHotelDbContext.DbConnectionString));
    services.AddTransient<ReservationRepository>();
    //...
}
```

2.6 — We'll use AutoMapper for mapping objects from entities to models/DTOs. This is optional, but you can add the [AutoMapper](https://www.nuget.org/packages/AutoMapper/) package to your project. AutoMapper must be configured at the startup of your project, so write the lines below in the Configure method of Startup.cs:

```csharp
public void Configure(IApplicationBuilder app, IHostingEnvironment env, MyHotelDbContext dbContext)
{
    //...
    InitializeMapper();
}

private static void InitializeMapper()
{
    Mapper.Initialize(x =>
    {
        x.CreateMap<Guest, GuestModel>();
        x.CreateMap<Room, RoomModel>();
        x.CreateMap<Reservation, ReservationModel>();
    });
}
```

2.7 — Add Entity Framework migrations and apply them to create your database. Write the commands below in the Package Manager Console.

```bash
add-migration "InitialCreate"
update-database
```

So far you've created your entities, configured the Entity Framework library and created a repository to get all reservations. To be able to list the reservations, let's change the default Angular template. First of all, create a controller to feed the Angular client.
The URL for listing all reservations will be *http://\*domain\*/api/reservations/list*.

```csharp
[Route("api/[controller]")]
public class ReservationsController : Controller
{
    private readonly ReservationRepository _reservationRepository;

    public ReservationsController(ReservationRepository reservationRepository)
    {
        _reservationRepository = reservationRepository;
    }

    [HttpGet("[action]")]
    public async Task<List<ReservationModel>> List()
    {
        return await _reservationRepository.GetAll<ReservationModel>();
    }
}
```

The new Angular template has a fetch-data page to show sample data. Let's modify *fetch-data.component.ts* to show our data.

```ts
import { Component, Inject } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  selector: 'app-fetch-data',
  templateUrl: './fetch-data.component.html'
})
export class FetchDataComponent {
  public reservations: Reservation[];

  constructor(http: HttpClient, @Inject('BASE_URL') baseUrl: string) {
    http.get<Reservation[]>(baseUrl + 'api/Reservations/List').subscribe(result => {
      this.reservations = result;
    }, error => console.error(error));
  }
}

interface Room {
  id: number;
  number: number;
  name: string;
  status: string;
  allowedSmoking: boolean;
}

interface Guest {
  id: number;
  name: string;
  registerDate: Date;
}

interface Reservation {
  checkinDate: Date;
  checkoutDate: Date;
  room: Room;
  guest: Guest;
}
```

Now change the content of *fetch-data.component.html*.

```html
<h1>Reservations ({{reservations ? reservations.length : 0}})</h1>

<p *ngIf="!reservations"><em>Loading...</em></p>

<table class='table' *ngIf="reservations">
  <thead class="thead-dark">
    <tr>
      <th>Guest Name</th>
      <th>Room Number</th>
      <th>Checkin Date</th>
      <th>Checkout Date</th>
    </tr>
  </thead>
  <tbody>
    <tr *ngFor="let reservation of reservations">
      <td>{{ reservation.guest.name }}</td>
      <td>{{ reservation.room.number }}</td>
      <td>{{ reservation.checkinDate | date: 'dd.MM.yyyy'}}</td>
      <td>{{ reservation.checkoutDate | date: 'dd.MM.yyyy' }}</td>
    </tr>
  </tbody>
</table>
```

Run the application and click the *Fetch Data* link to list our reservations on the page. This fetches the data from ReservationsController, which is a classical REST API. Now we'll add GraphQL to our project.

3- Add the [GraphQL](https://www.nuget.org/packages/GraphQL/) package. The GitHub link is [https://github.com/graphql-dotnet/graphql-dotnet](https://github.com/graphql-dotnet/graphql-dotnet)

Also add the following packages (needed later on):

- [GraphQL.Server.Ui.Playground](https://www.nuget.org/packages/GraphQL.Server.Ui.Playground) (an API explorer like Swagger)
- [GraphQL.Server.Transports.AspNetCore](https://www.nuget.org/packages/GraphQL.Server.Transports.AspNetCore) (middleware for ASP.NET Core)

You can simply add the below 3 package references to *MyHotel.csproj*:

```xml
<ItemGroup>
  <PackageReference Include="GraphQL" Version="2.4.0" />
  <PackageReference Include="GraphQL.Server.Transports.AspNetCore" Version="3.4.0" />
  <PackageReference Include="GraphQL.Server.Ui.Playground" Version="3.4.0" />
</ItemGroup>
```

4- Create a *GraphQL Type* for each of your entities:

```csharp
public class GuestType : ObjectGraphType<Guest>
{
    public GuestType()
    {
        Field(x => x.Id);
        Field(x => x.Name);
        Field(x => x.RegisterDate);
    }
}

public class ReservationType : ObjectGraphType<Reservation>
{
    public ReservationType()
    {
        Field(x => x.Id);
        Field(x => x.CheckinDate);
        Field(x => x.CheckoutDate);
        Field<GuestType>(nameof(Reservation.Guest));
        Field<RoomType>(nameof(Reservation.Room));
    }
}

public class RoomStatusType : EnumerationGraphType<RoomStatus>
{
}

public class RoomType : ObjectGraphType<Room>
{
    public RoomType()
    {
        Field(x => x.Id);
        Field(x => x.Name);
        Field(x => x.Number);
        Field(x => x.AllowedSmoking);
        Field<RoomStatusType>(nameof(Room.Status));
    }
}
```

<center> <img src="https://miro.medium.com/max/772/1*I5MkVvyCuzmhXZMTzh1OjQ.png"> </center>

<center> Mapping for GraphQL and .NET </center>

Create a schema in the project as below:

```csharp
public class MyHotelSchema : Schema
{
    public MyHotelSchema(IDependencyResolver resolver) : base(resolver)
    {
        Query = resolver.Resolve<MyHotelQuery>();
    }
}
```

Now add your new schema, graph types and the GraphQL middleware to your ASP.NET Core startup.

```csharp
public void ConfigureServices(IServiceCollection services)
{
    //...
    services.AddScoped<IDependencyResolver>(x => new FuncDependencyResolver(x.GetRequiredService));
    services.AddScoped<MyHotelSchema>();

    services.AddGraphQL(x =>
    {
        x.ExposeExceptions = true; //set true only in development mode. make it switchable.
    })
    .AddGraphTypes(ServiceLifetime.Scoped);
}
```

And in the Configure method of Startup.cs, use GraphQL. We'll also use the GraphQL Playground library, which is great for exploring the data with GraphQL; it's similar to Swagger.

```csharp
public void Configure(IApplicationBuilder app, IHostingEnvironment env, MyHotelDbContext dbContext)
{
    //...
    app.UseGraphQL<MyHotelSchema>();
    app.UseGraphQLPlayground(new GraphQLPlaygroundOptions()); //to explore the API navigate to https://*DOMAIN*/ui/playground
    //... UseMVC ....
}
```

Run the project and navigate to the following URL:

```
https://*DOMAIN*/ui/playground
```

To see your reservation data in the GraphQL Playground page you can write the below query.
```GraphQL query TestQuery { reservations { id checkinDate checkoutDate guest { id name registerDate } room { id name number allowedSmoking status } } } ``` This query is a request to get the list of reservations including *Guest* and *Room* child entities. You can try different GraphQL queries to test it out. 5- Filtering Data To accept arguments from your query, you need to add arguments to your query. Each argument can be added with a type, description and default value. ```csharp public class MyHotelQuery : ObjectGraphType { public MyHotelQuery(ReservationRepository reservationRepository) { Field<ListGraphType<ReservationType>>("reservations", arguments: new QueryArguments(new List<QueryArgument> { new QueryArgument<IdGraphType> { Name = "id" }, new QueryArgument<DateGraphType> { Name = "checkinDate" }, new QueryArgument<DateGraphType> { Name = "checkoutDate" }, new QueryArgument<BooleanGraphType> { Name = "roomAllowedSmoking" }, new QueryArgument<RoomStatusType> { Name = "roomStatus" } }), resolve: context => { var query = reservationRepository.GetQuery(); var reservationId = context.GetArgument<int?>("id"); if (reservationId.HasValue) { return reservationRepository.GetQuery().Where(r => r.Id == reservationId.Value); } var checkinDate = context.GetArgument<DateTime?>("checkinDate"); if (checkinDate.HasValue) { return reservationRepository.GetQuery() .Where(r => r.CheckinDate.Date == checkinDate.Value.Date); } var checkoutDate = context.GetArgument<DateTime?>("checkoutDate"); if (checkoutDate.HasValue) { return reservationRepository.GetQuery() .Where(r => r.CheckoutDate.Date >= checkoutDate.Value.Date); } var allowedSmoking = context.GetArgument<bool?>("roomAllowedSmoking"); if (allowedSmoking.HasValue) { return reservationRepository.GetQuery() .Where(r => r.Room.AllowedSmoking == allowedSmoking.Value); } var roomStatus = context.GetArgument<RoomStatus?>("roomStatus"); if (roomStatus.HasValue) { return reservationRepository.GetQuery().Where(r => r.Room.Status 
== roomStatus.Value);
                }

                return query.ToList();
            }
        );
    }
}
```

Now you can filter your query by the following fields: id, checkinDate, checkoutDate, roomAllowedSmoking, roomStatus. In the below GraphQL query, we filter reservations by the room.allowedSmoking boolean field and the room.status enum.

```GraphQL
query TestQuery {
  reservations (roomAllowedSmoking: false, roomStatus: AVAILABLE) {
    id
    checkinDate
    checkoutDate
    guest {
      id
      name
      registerDate
    }
    room {
      id
      name
      number
      allowedSmoking
      status
    }
  }
}
```

6- Adding Authorization

The GraphQL middleware is a new layer that doesn't depend on the MVC middleware. That's why the request will not be authorized by MVC. So we have to authorize it by injecting *AuthorizationService* and checking policies. To do this, open *Startup.cs* and in the ConfigureServices method append the below code to *services.AddGraphQL()*:

```csharp
services.AddGraphQL(x =>
{
    x.ExposeExceptions = true;
})
.AddGraphTypes(ServiceLifetime.Scoped)
.AddUserContextBuilder(httpContext => httpContext.User)
.AddDataLoader();
```

Then you can get the user identity within the resolve code block of your query:

```csharp
public MyHotelQuery(ReservationRepository reservationRepository)
{
    Field<ListGraphType<ReservationType>>("reservations",
        arguments: new QueryArguments(new List<QueryArgument>
        {
            //............
        }),
        resolve: context =>
        {
            var user = (ClaimsPrincipal)context.UserContext;
            var isUserAuthenticated = ((ClaimsIdentity) user.Identity).IsAuthenticated;
            //............
        }
    );
}
```

7- Interfaces

GraphQL interfaces let several object types share a set of common fields. In the sample below, ReservationInterface declares the common reservation fields and the LuggageReservation type implements it:

```csharp
public class ReservationInterface : InterfaceGraphType<Reservation>
{
    public ReservationInterface()
    {
        Name = "Reservation";
        Field(x => x.Id);
        Field(x => x.CheckinDate).Description("The first day of the stay");
        Field(x => x.CheckoutDate).Description("The leaving day");
        Field<GuestType>(nameof(Reservation.Guest));
        Field<RoomType>(nameof(Reservation.Room));
    }
}

public class ReservationType : ObjectGraphType<LuggageReservation>
{
    public ReservationType()
    {
        Name = "LuggageReservation";
        Interface<ReservationInterface>();
    }
}

public class LuggageReservation : Reservation
{
}
```

8- Consuming GraphQL API

Sending a query via an HTTP request is simple! You send a GET request to the below URL, including the query you used on the GraphQL Playground page.

```GraphQL
http://*domain*/graphql?query=
```

Let's make a simple query to get the list of reservations with id, checkinDate and checkoutDate. Here's our query:

```GraphQL
query TestQuery {
  reservations {
    id
    checkinDate
    checkoutDate
  }
}
```

To send this query as an HTTP request, remove the first 2 words, **query** and **TestQuery**, then flatten the string into one line as below:

```GraphQL
{reservations {id checkinDate checkoutDate}}
```

Now we append this query string to our graphql URL:

```GraphQL
https://localhost:44349/graphql?query={reservations {id checkinDate checkoutDate}}
```

**When you go to this URL in your browser, you'll see the response as a JSON object. Simple and straightforward!**

```json
{
  "data": {
    "reservations": [
      {
        "id": 1,
        "checkinDate": "2019-02-05",
        "checkoutDate": "2019-02-18"
      },
      {
        "id": 2,
        "checkinDate": "2019-02-10",
        "checkoutDate": "2019-02-20"
      }
    ]
  }
}
```

> Write your queries with the help of GraphQL Playground and then use them in your code.

8.1 — Sending Multiple Queries In a Single Request

To send multiple queries, you can use *aliases* in the query. In the following query, we assign a name to each query: reservation_1 and reservation_2.
```GraphQL
{
  reservation_1: reservations(id: 1) {
    id
    checkinDate
    checkoutDate
  }
  reservation_2: reservations(id: 2) {
    id
    checkinDate
    checkoutDate
  }
}
```

And we successfully get 2 results in a single response:

```json
{
  "data": {
    "reservation_1": [
      {
        "id": 1,
        "checkinDate": "2019-02-05",
        "checkoutDate": "2019-02-18"
      }
    ],
    "reservation_2": [
      {
        "id": 2,
        "checkinDate": "2019-02-10",
        "checkoutDate": "2019-02-20"
      }
    ]
  }
}
```

8.2 — Reducing Repetition of Fields in a Query

In the previous sample, there were 2 queries asking for the same fields of a reservation. To reduce this field repetition, we can use **Fragments**. We already covered Fragments in previous topics. See how we do it:

```GraphQL
{
  reservation_1: reservations(id: 1) {
    ...reservationFields
  }
  reservation_2: reservations(id: 2) {
    ...reservationFields
  }
}

fragment reservationFields on ReservationType {
  id
  checkinDate
  checkoutDate
}
```

8.3 — Running a Single Query Among Multiple Queries

Sometimes you need to run only one of several queries in a request. To do this, we use named queries. You see 2 queries below: **all_reservations** and **selected_reservation**.

```GraphQL
query all_reservations {
  reservations {
    id
    checkinDate
    checkoutDate
  }
}

query selected_reservation {
  reservations(id: 1) {
    id
    guest {
      name
    }
    room {
      name
    }
  }
}
```

To run one of these queries in a URL, simply use **operationName** in your GraphQL request. The below URL will run the **selected_reservation** query.
```GraphQL
https://localhost:44349/graphql?query=query all_reservations {reservations {id checkinDate checkoutDate}} query selected_reservation {reservations(id: 1) {id guest{name} room{name}}}&operationName=selected_reservation
```

And the response of the **selected_reservation** query:

```json
{
  "data": {
    "reservations": [{
      "id": 1,
      "guest": {
        "name": "Alper Ebicoglu"
      },
      "room": {
        "name": "white-room"
      }
    }]
  }
}
```

> You can use "[tools.knowledgewalls.com](http://tools.knowledgewalls.com/jsontoquerystring)" to convert your GraphQL JSON Query to a URL Query String!

8.4 — Using Variables in an HTTP Request

We learned the usage of variables in a query before: you declare them with a type in the query definition (e.g. `query ($id: Int)`) and send their values in a separate `variables` JSON object next to the query.

8.5 — Directives

GraphQL directives were created to deal with recurring tasks during implementation. One of them is the "**include**" directive. It gets an argument "**if**", which is a Boolean value. We created a variable called "**includeGuest**", and if it's *true*, the *guest* field is included in the response (e.g. `guest @include(if: $includeGuest) { name }`).

And there's the opposite of include, called "**skip**". It behaves the opposite way of "**include**": we set the "**skip**" value to *false* so the field is included.

8.6 — Sending Errors From the Host

If you want to send your custom error message to the client, you can add a new **ExecutionError** to the **context.Errors** list in the **resolve** method of your query. In the below sample, we check the reservationId; if it's not greater than zero, we add an error:

```csharp
public class MyHotelQuery : ObjectGraphType
{
    public MyHotelQuery(ReservationRepository reservationRepository)
    {
        Field<ListGraphType<ReservationType>>("reservations",
            arguments: new QueryArguments(new List<QueryArgument>
            {
                //....
            }),
            resolve: context =>
            {
                var query = reservationRepository.GetQuery();
                //...
                var reservationId = context.GetArgument<int?>("id");
                if (reservationId.HasValue)
                {
                    if (reservationId.Value <= 0)
                    {
                        context.Errors.Add(new ExecutionError("reservationId must be greater than zero!"));
                        return new List<Reservation>();
                    }

                    return reservationRepository.GetQuery().Where(r => r.Id == reservationId.Value);
                }
                //...
            }
        );
    }
}
```

9 — Consuming Your GraphQL API from C#

9.1 — Consuming GraphQL API With a Simple C# Http Client

To consume your GraphQL API in a C# application, you can create a simple class to map the data from the host. Here is my custom class for that:

```csharp
public class Response<T>
{
    public T Data { get; set; }

    public List<ErrorModel> Errors { get; set; }

    public void OnErrorThrowException()
    {
        if (Errors != null && Errors.Any())
        {
            throw new ApplicationException($"Message: {Errors[0].Message} Code: {Errors[0].Code}");
        }
    }
}

public class ErrorModel
{
    public string Message { get; set; }
    public string Code { get; set; }
}
```

And this is a basic client to transfer data with GraphQL. We get a JSON from the GraphQL endpoint, then we deserialize it to our custom **Response** class.
```csharp
public class MyHotelGraphqlClient
{
    public const string GraphqlAddress = "https://localhost:44349/graphql/";

    private readonly HttpClient _httpClient;

    public MyHotelGraphqlClient(HttpClient httpClient)
    {
        _httpClient = httpClient;
    }

    public async Task<Response<ReservationContainer>> GetReservationsAsync()
    {
        var response = await _httpClient.GetAsync(
            @"?query={ reservations { checkinDate guest { name } room { name } } }");
        var stringResult = await response.Content.ReadAsStringAsync();
        return JsonConvert.DeserializeObject<Response<ReservationContainer>>(stringResult);
    }
}
```

To be able to inject this client, you need to register it in **Startup.cs**:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    //we add this after services.AddMvc()
    services.AddHttpClient<MyHotelGraphqlClient>(x => x.BaseAddress = new Uri(MyHotelGraphqlClient.GraphqlAddress));
    //........
}
```

And now it's ready to be used. Inject **MyHotelGraphqlClient** with the name _myHotelGraphqlClient and use it:

```csharp
var response = await _myHotelGraphqlClient.GetReservationsAsync();
response.OnErrorThrowException();
var reservations = response.Data.Reservations;
```

9.2 — Consuming GraphQL API With the GraphQl.Client Library

**GraphQl.Client** is an open-source .NET Standard library that helps you make requests to a GraphQL API. [Here's the GitHub Repo](https://github.com/graphql-dotnet/graphql-client) and the [GraphQL.Client package on NuGet](https://www.nuget.org/packages/GraphQL.Client).

Create a class and name it **ReservationGraphqlClient**. This is our client that will use GraphQl.Client to fetch the reservations data. Inject **GraphQLClient**, which is in the *GraphQL.Client* namespace. Create a **GraphQLRequest** and set its **Query** property. Then post it via **GraphQLClient**. In case you get any error, check the **Errors** property of the response. You can retrieve the reservations via the **GetDataFieldAs()** method.
```csharp
public class ReservationGraphqlClient
{
    private readonly GraphQLClient _client;

    public ReservationGraphqlClient(GraphQLClient client)
    {
        _client = client;
    }

    public async Task<List<ReservationModel>> GetReservationsAsync()
    {
        var query = new GraphQLRequest
        {
            Query = @"
                query reservation {
                  reservations {
                    checkinDate
                    guest {
                      name
                    }
                    room {
                      name
                    }
                  }
                }"
        };

        var response = await _client.PostAsync(query);
        if (response.Errors != null && response.Errors.Any())
        {
            throw new ApplicationException(response.Errors[0].Message);
        }

        var reservations = response.GetDataFieldAs<List<ReservationModel>>("reservations");
        return reservations;
    }
}
```

To be able to inject **GraphQLClient** and **ReservationGraphqlClient**, register them in **Startup** as *singleton* dependencies:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    //Add it after the Mvc middleware.
    services.AddSingleton(t => new GraphQLClient(Configuration["GraphQlEndpoint"]));
    services.AddSingleton<ReservationGraphqlClient>();
    //...
}
```

10 — Consuming Your GraphQL API from the Client Side

10.1 — Fetching Data Using Angular2

To make HTTP requests, you need to inject HttpClient. Then make a GET request to *http://\*domain\*/graphql/* and add your query to the query parameter of the URL.
```typescript
export class FetchDataComponent {
  public reservations: Reservation[];

  constructor(http: HttpClient, @Inject('BASE_URL') baseUrl: string) {
    this.fetchDirectlyFromGraphQL = function () {
      var query = `?query=
        {
          reservations {
            checkinDate
            guest {
              name
            }
            room {
              name
            }
          }
        }`;

      http.get<any>(baseUrl + 'graphql/' + query).subscribe(result => {
        this.reservations = result.data.reservations;
      }, error => console.error(error));
    }
  }
}
```

10.2 — Fetching Data Using JavaScript

To fetch data using plain JavaScript, you don't need any extra library; use the **fetch** API to send the request:

```javascript
fetch('/graphql', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
  },
  body: JSON.stringify({
    query: 'query reservation {reservations {checkinDate guest {name} room {name}}}'
  })
})
  .then(r => r.json())
  .then(response => {
    this.reservations = response.data.reservations;
  });
```

10.3 — Using Apollo Client

There's a rich client that helps us fetch data with GraphQL. [Apollo](https://www.apollographql.com/) Client is the ultra-flexible, community-driven GraphQL client for Angular, JavaScript, and native platforms. It is designed from the ground up to make it easy to build UI components that fetch data with GraphQL. It's open-source, but there are also paid plans. To use it from a plain script tag, import the following browser bundle:

```html
<script src="https://unpkg.com/apollo-client-browser@1.9.0"></script>
```
- See the docs [https://www.apollographql.com/docs/angular/](https://www.apollographql.com/docs/angular/)
- See the GitHub Repo [https://github.com/apollographql/apollo-client](https://github.com/apollographql/apollo-client)
- See how to use it in Angular2 [https://gearheart.io/blog/how-to-use-graphql-with-angular-2-with-example/](https://gearheart.io/blog/how-to-use-graphql-with-angular-2-with-example/)

Here's how to query using Apollo Client:

```javascript
var client = new Apollo.lib.ApolloClient({
  networkInterface: Apollo.lib.createNetworkInterface({
    uri: "https://localhost:44349/graphql"
  })
});

const query = Apollo.gql`
  query reservation {
    reservations {
      checkinDate
      guest {
        name
      }
      room {
        name
      }
    }
  }`;

client.query({ query: query }).then(result => {
  this.reservations = result.data.reservations;
});
```

Finally, all the code mentioned here is accessible on GitHub: [https://github.com/ebicoglu/AspNetCoreGraphQL-MyHotel](https://github.com/ebicoglu/AspNetCoreGraphQL-MyHotel)

* * *

## Read More:

1. [How to Use Attribute Directives to Avoid Repetition in Angular Templates](https://volosoft.com/blog/attribute-directives-to-avoid-repetition-in-angular-templates)
2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub Pattern](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub)
3. [Why You Should Prefer Singleton Pattern over a Static Class?](https://volosoft.com/blog/Prefer-Singleton-Pattern-over-Static-Class)
**Event date:** 29 January — 01 February 2019

**Location:** QEII Centre / London

Last week we exhibited at [NDC London 2019](https://ndc-london.com/) and had such a fantastic time. 105 speakers, 36 different technologies and 111 talks…

<center> Sessions </center>

For us it's a very important conference, because many valuable speakers attend it. Many of the speakers and attendees are from the Microsoft stack and are mainly hands-on with .NET coding. That's why we love to sponsor the event every year.

As a developer, it was nice to meet Scott Hanselman and Jon Galloway. We had an opportunity to show them our new web application framework, [abp.io](https://abp.io/). Scott especially liked our new tag helpers and dynamic forms for ASP.NET Core.

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">It’s very cool. Check out dynamic forms!</p>— Scott Hanselman (@shanselman) <a href="https://twitter.com/shanselman/status/1091642952445714432?ref_src=twsrc%5Etfw">February 2, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Thanks for the great demos! Excited to see what you're building for <a href="https://twitter.com/aspboilerplate?ref_src=twsrc%5Etfw">@aspboilerplate</a>!</p>— Jon Galloway (@jongalloway) <a href="https://twitter.com/jongalloway/status/1090978947930771461?ref_src=twsrc%5Etfw">January 31, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

We attended Jon Galloway's talk, called "*ASP.NET Core the One Hour Makeover*", and it was very exciting to see [ASP.NET Boilerplate](https://aspnetboilerplate.com/) in the slides as a 3rd-party solution.
This is the link for Jon Galloway's talk notes: [https://gist.github.com/jongalloway/70e5373837534abe6c89e7ab3ec4efb5](https://gist.github.com/jongalloway/70e5373837534abe6c89e7ab3ec4efb5)

<center> Jon Galloway's talk </center>

We also met Jon Skeet and Troy Hunt. Personally, I'm fascinated by how Troy Hunt climbed the career ladder and became a world-renowned security advisor.

<center> Upper-left Jon Skeet; Upper-right Jon Galloway, Lower-left Scott Hanselman, Lower-right Troy Hunt </center>

There were really big companies there, like Microsoft, Google, DevExpress, JetBrains, Twilio etc. We were the only company presenting free, open-source software: [AspNet Boilerplate](https://aspnetboilerplate.com/). We had nice conversations with developers from different parts of the world and discussed software topics.

<center> Exhibition area </center>

<center> Volosoft's Booth </center>

When it comes to swag and freebies, I guess Volosoft raised the bar quite high. We brought really high-class swag: wireless earphones, smart watches, hand-made chocolates and a brand-new MacBook Air. People liked the hand-made chocolates very much 😊 We also had some funny stickers for developers. On the second day, we held a raffle for the MacBook Air, and the new MacBook went to the lucky [Ehab ElGindy](https://twitter.com/ehabelgindy).

<center> Swags </center>

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Hands down the best freebies I've ever received at a conference!
<a href="https://twitter.com/volosoftcompany?ref_src=twsrc%5Etfw">@volosoftcompany</a> <a href="https://twitter.com/NDC_Conferences?ref_src=twsrc%5Etfw">@NDC_Conferences</a> <a href="https://t.co/8gR1SRaCPO">pic.twitter.com/8gR1SRaCPO</a></p>— Jon Leigh (@iamjonleigh) <a href="https://twitter.com/iamjonleigh/status/1090965006739537923?ref_src=twsrc%5Etfw">January 31, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

It was a very satisfying conference for us as an exhibitor. Thanks to everyone, and we hope to see you again in London next year!
# Problem

Let's start with an example to understand the problem. Check the following code:

<center> <img src="https://miro.medium.com/max/1206/1*GtvvoQsWrU-BjhRk-3qSww.png"> </center> <center> API method that can return 404, which is not documented </center>

The above API looks like the following in swagger-ui:

<center> Undocumented API method </center>

As you can see, there is no information that a 404 can be returned, even though the method calls `NotFound()`. We can only see what happens when we request the method with an id.

<center> 404 with undocumented </center>

When we request the API with a non-existing product, it returns a 404 with an "undocumented" warning.

## Detecting the Problem: Analyzers

Analyzers work with controllers annotated with `ApiController`, introduced in ASP.NET Core 2.1.0, while building on the API conventions released in ASP.NET Core 2.2.0. To start using this, install the package:

```csharp
Microsoft.AspNetCore.Mvc.Api.Analyzers
```

You should see warnings (squiggly lines) highlighting return types that aren't documented, as well as warnings in the build output. In Visual Studio, these should additionally appear under the "Warnings" tab in the "Error List" dialog. You now have the opportunity to address these warnings using code fixes.

Let's look at the analyzer in action.

Before adding the analyzer:

<center> Before adding analyzer </center>

After adding the analyzer:

<center> After adding analyzer </center>

The analyzer identified that the action returned a `404` but did not document it using a `ProducesResponseTypeAttribute`. It's a great way to identify areas of your application that are lacking Swagger documentation and correct them.

## Fix the Problem

Adding `ProducesResponseTypeAttribute` will fix the problem. Check the code:

<center> Adding `ProducesResponseTypeAttribute` will fix the problem </center>

After adding the above attribute, let's check swagger-ui again:

<center> swagger-ui </center>

Now we know the method can return a 404 response.
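Since the code in this post is shown as screenshots, here is a minimal sketch of the pattern they illustrate. The `ProductsController`, `Product` and `IProductRepository` names are hypothetical, not from the original sample:

```csharp
[Route("api/[controller]")]
[ApiController]
public class ProductsController : ControllerBase
{
    private readonly IProductRepository _repository; // hypothetical repository

    public ProductsController(IProductRepository repository)
    {
        _repository = repository;
    }

    // Declaring both status codes makes them show up in swagger-ui:
    [HttpGet("{id}")]
    [ProducesResponseType(typeof(Product), 200)]
    [ProducesResponseType(404)]
    public ActionResult<Product> GetById(int id)
    {
        var product = _repository.Find(id);
        if (product == null)
        {
            return NotFound(); // now documented by the 404 attribute above
        }

        return product;
    }
}
```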
It is now documented.

# Conventions

> **From MSDN:** If your controllers follow some common patterns, e.g. they are all primarily CRUD endpoints, and you aren't already using `ProducesResponseType` or `Produces` to document them, you could consider using API conventions. Conventions let you define the most common "conventional" return types and status codes that you return from your actions, and apply them to individual actions or controllers, or all controllers in an assembly. Conventions are a substitute for decorating individual actions with `ProducesResponseType` attributes.
>
> By default, ASP.NET Core MVC 2.2 ships with a set of default conventions — `DefaultApiConventions` — that's based on the controller that ASP.NET Core scaffolds. If your actions follow the pattern that scaffolding produces, you should be successful using the default conventions.

There are 3 ways to apply a convention to a controller action:

- Applying the `ApiConventionType` attribute as an assembly-level attribute. This applies the specified convention to all controllers in an assembly.
- Using the `ApiConventionType` attribute on a controller.
- Using `ApiConventionMethod`. This attribute accepts both the type and the convention method.

## Authoring conventions

You can write your own conventions: define a static class containing the rules, then apply it to your controller or action.

> An easy way to get started authoring a custom convention is to start by copying the body of `DefaultApiConventions` and modifying it.
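Since the convention samples in this post are screenshots, the three ways of applying a convention can be sketched as follows. This is an illustrative sketch for ASP.NET Core 2.2; `ProductsController`, `Product` and the custom `Find` convention are hypothetical names:

```csharp
// 1) Assembly level: applies the convention to all controllers in the assembly.
[assembly: ApiConventionType(typeof(DefaultApiConventions))]

// 2) Controller level:
[ApiConventionType(typeof(DefaultApiConventions))]
public class ProductsController : ControllerBase
{
    // 3) Action level: pick a specific convention method.
    [ApiConventionMethod(typeof(DefaultApiConventions), nameof(DefaultApiConventions.Get))]
    [HttpGet("{id}")]
    public ActionResult<Product> Get(int id) => null; // body omitted for brevity
}

// A custom convention: a static class whose method signatures carry the response metadata.
public static class MyAppConventions
{
    [ProducesResponseType(200)]
    [ProducesResponseType(404)]
    [ApiConventionNameMatch(ApiConventionNameMatchBehavior.Prefix)] // matches FindById, FindProduct, etc.
    public static void Find(int id) { }
}
```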
Here's a link to the source of the type: [https://raw.githubusercontent.com/aspnet/Mvc/release/2.2/src/Microsoft.AspNetCore.Mvc.Core/DefaultApiConventions.cs](https://raw.githubusercontent.com/aspnet/Mvc/release/2.2/src/Microsoft.AspNetCore.Mvc.Core/DefaultApiConventions.cs)

Here is the MSDN article about this topic: [https://blogs.msdn.microsoft.com/webdev/2018/08/23/asp-net-core-2-20-preview1-open-api-analyzers-conventions/](https://blogs.msdn.microsoft.com/webdev/2018/08/23/asp-net-core-2-20-preview1-open-api-analyzers-conventions/)

Source code of the examples: [https://github.com/alirizaadiyahsi/NetCore22Demos](https://github.com/alirizaadiyahsi/NetCore22Demos)
# Introduction

We have recently moved our [ASP.NET ZERO](https://web.archive.org/web/20180412124653/https://www.aspnetzero.com/) solution from **ASP.NET MVC 5.2.3** to **ASP.NET Core 1.0**. In this post, I will share my experiences and explain the mechanics of this migration in brief. Notice that I didn't convert the solution to .NET Core; I just moved to ASP.NET Core on top of the full **.NET Framework 4.6.1**.

# Solution Structure

I decided to use the new **.xproj/project.json** format instead of the old .csproj format for the solution. Although Microsoft [announced](https://blogs.msdn.microsoft.com/dotnet/2016/05/23/changes-to-project-json/) that the .xproj/project.json format will be changed back to the **.csproj** format, the new .csproj format will actually be different from the old one. So, I decided to use the latest format (as Microsoft also does for the ASP.NET Core platform) and migrate to the new .csproj format when the time comes.

There are pros and cons of the project.json format, but I will not go into all the details. The biggest problem is that **it's not well documented yet**. I found some partial documents/articles, and most of them were outdated. So, a "try and see" approach was my best friend in some cases.

The old solution structure was like this:

The new solution structure is shown below:

The .WebApi project is gone and the solution is separated into src & test folders (the default convention for the new solution structure). Obviously, I created a new solution and empty projects inside it, then copied all files into this solution manually. All layers (including the test project), except the Web layer, compiled successfully in the new project format without any change.

## Nuget Packages

After moving the code to the new solution folder, I also added the needed NuGet package references to project.json.
For example, my project.json file for the new .Core project is like this:

```json
{
  "version": "1.0.0.0-*",
  "dependencies": {
    "Abp": "0.11.2",
    "Abp.AutoMapper": "0.11.2",
    "Abp.Zero": "0.11.2",
    "Abp.Zero.Ldap": "0.11.2",
    "Microsoft.Extensions.Configuration.EnvironmentVariables": "1.0.0",
    "Microsoft.Extensions.Configuration.Json": "1.0.0",
    "Microsoft.Extensions.Configuration.UserSecrets": "1.0.0"
  },
  "frameworks": {
    "net461": { }
  },
  "buildOptions": {
    "embed": {
      "include": [
        "Localization/PhoneBook/*.xml",
        "Emailing/EmailTemplates/default.html"
      ]
    }
  }
}
```

## Embedded Resources

Previously, we just changed the Build Action to Embedded Resource to [embed a file into an assembly](https://web.archive.org/web/20180412124653/https://support.microsoft.com/en-us/kb/319292#bookmark-4). With the new xproj format, we should do it in project.json, as shown above. I added all the xml files in Localization/AbpZeroTemplate and default.html as embedded resources. Consuming an embedded resource hasn't changed.

# ASP.NET MVC

There was no code change for the class library projects (except the project structure and dependencies). But when it comes to ASP.NET Core, there is much to talk about.

## Including Styles & Scripts in Views

This was the most time-consuming and tedious part of the migration. Since we are using many client-side libraries and we have many razor views, I manually changed all views to include the related styles and scripts. An example CSS include approach in ASP.NET Core:

```xml
<environment names="Development">
    <link rel="stylesheet" href="~/view-resources/Areas/AppAreaName/Views/Users/Index.css" asp-append-version="true" />
</environment>
<environment names="Staging,Production">
    <link rel="stylesheet" href="~/view-resources/Areas/AppAreaName/Views/Users/Index.min.css" asp-append-version="true" />
</environment>
```

We include the un-minified version in **development** and the minified version in **production**.
Also, notice the **asp-append-version="true"** attribute, which prevents browsers from caching stale styles/scripts, even at development time.

## Script/Style Bundling & Minifying

In the previous ASP.NET MVC, we were using the Microsoft [ASP.NET Web Optimization](https://web.archive.org/web/20180412124653/https://www.nuget.org/packages/Microsoft.AspNet.Web.Optimization/) library to accomplish this. I used the new [**bundler & minifier**](https://web.archive.org/web/20180412124653/https://visualstudiogallery.msdn.microsoft.com/9ec27da7-e24b-4d56-8064-fd7e88ac1c40) Visual Studio extension for minifying css/js files and bundling them when needed. Using this tool is quite easy, but it was another time-consuming task. I created some bundled css/js files for my layout pages to reduce the included css/js file count. I also minified the css/js files used in individual razor views.

We are also using [**less**](https://web.archive.org/web/20180412124653/http://lesscss.org/) for writing CSS. I used the [**Web Compiler**](https://web.archive.org/web/20180412124653/https://visualstudiogallery.msdn.microsoft.com/3b329021-cd7a-4a01-86fc-714c2d05bb6c) Visual Studio extension to compile less to CSS (and minify the output automatically).

## Bye Bye @helper

We were using **@helper** to write simple functions that render HTML in a razor view in ASP.NET MVC 5.x. In the new ASP.NET Core MVC, there is no @helper. So, I converted the @helper blocks to **partial views**, converting their parameters to view models.

## Bye Bye Child Actions

ASP.NET Core introduced [**view components**](https://web.archive.org/web/20180412124653/https://docs.asp.net/en/latest/mvc/views/view-components.html), which can be used instead of child actions. So, we converted all child actions to view components.

## Switch View Base to RazorPage

Previously, we were creating view base classes deriving from WebViewPage. Now, we should derive them from the RazorPage class. So, we changed our base view class.
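As an illustrative sketch of such a base view class (the class name and helper are hypothetical, not from the ASP.NET Zero code), the switch looks roughly like this:

```csharp
// A custom base class for razor views, derived from RazorPage instead of WebViewPage:
public abstract class MyRazorPage<TModel> : RazorPage<TModel>
{
    // Shared helpers used by all views can live here, e.g. a localization shortcut:
    public string L(string name)
    {
        return name; // placeholder: resolve from a localization source in a real app
    }
}
```

Views then opt in via `@inherits MyRazorPage<TModel>` (or a global setting in _ViewImports.cshtml).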
Also, we can now [inject dependencies](https://web.archive.org/web/20180412124653/https://docs.asp.net/en/latest/mvc/views/dependency-injection.html) into razor views. So, I also changed static calls to injections.

## Bye Bye Web API

The ASP.NET MVC and Web API frameworks are unified in the new ASP.NET Core framework, and there is only the ASP.NET MVC Controller now. So, I changed my Web API Controllers to MVC Controllers by following the new rules. Fortunately, I didn't have many Web API Controllers, thanks to ABP's [dynamic Web API layer](https://web.archive.org/web/20180412124653/http://www.aspnetboilerplate.com/Pages/Documents/Dynamic-Web-API), which is also [implemented](https://web.archive.org/web/20180412124653/http://www.aspnetboilerplate.com/Pages/Documents/AspNet-Core#application-services-as-controllers) for ASP.NET Core. So, I just changed a few configuration lines to move all my application services from Web API Controllers to ASP.NET Core Controllers.

## Entity Framework Migration Problems

We were using Entity Framework 6.x in our previous solution, and we decided to keep EF 6.x in our new solution instead of migrating to EF Core 1.0. Some reasons for that decision were:

- EF Core still has some missing features (like lazy loading, seed data and [others](https://web.archive.org/web/20180412124653/https://docs.efproject.net/en/latest/efcore-vs-ef6/features.html#features-not-in-ef-core)) which we (and our customers) are using.
- EF Core has some missing interception/extension points. Therefore, we cannot use a consistent automatic [data filtering](https://web.archive.org/web/20180412124653/http://www.aspnetboilerplate.com/Pages/Documents/Data-Filters) tool (like [EF.DynamicFilters](https://web.archive.org/web/20180412124653/https://github.com/jcachat/EntityFramework.DynamicFilters/issues/48)) for [multi-tenancy](https://web.archive.org/web/20180412124653/http://www.aspnetboilerplate.com/Pages/Documents/Multi-Tenancy) and soft delete.
So, we decided to move our existing EF 6.x layer (and migrations) into the new solution format, but it just didn't work (because of embedded resource changes and other reasons). We found a tool, [Migrator.EF](https://web.archive.org/web/20180412124653/https://github.com/mrahhal/Migrator.EF6), which allows us to add/apply migrations just like EF Core. We don't use Package Manager Console (PMC) commands like "Update-Database"; instead, we use command prompt commands like "dotnet ef database update". Migrator.EF does a good job.

But what about our existing migration classes? We manually changed their format to work properly with the new tool. Assume that we have a migration like this:

```csharp
public partial class Added_ConnString_To_Tenant_Entity : DbMigration
{
    public override void Up()
    {
        AddColumn("dbo.AbpTenants", "ConnectionString", c => c.String(maxLength: 1024));
    }

    public override void Down()
    {
        DropColumn("dbo.AbpTenants", "ConnectionString");
    }
}
```

This class needs no change. But notice that it's a partial class. Its other part (.Designer.cs) is like this:

```csharp
[GeneratedCode("EntityFramework.Migrations", "6.1.3-40302")]
public sealed partial class Added_ConnString_To_Tenant_Entity : IMigrationMetadata
{
    private readonly ResourceManager Resources = new ResourceManager(typeof(Added_ConnString_To_Tenant_Entity));

    string IMigrationMetadata.Id
    {
        get { return "201604201100303_Added_ConnString_To_Tenant_Entity"; }
    }

    string IMigrationMetadata.Source
    {
        get { return null; }
    }

    string IMigrationMetadata.Target
    {
        get { return Resources.GetString("Target"); }
    }
}
```

As you see, it uses a resource manager to get some metadata (the model snapshot) from a resx file.
We removed the resource manager usage and changed the Designer partial class like this:

```csharp
[GeneratedCode("EntityFramework.Migrations", "6.1.3-40302")]
public sealed partial class Added_ConnString_To_Tenant_Entity : IMigrationMetadata
{
    string IMigrationMetadata.Id
    {
        get { return "201604201100303_Added_ConnString_To_Tenant_Entity"; }
    }

    string IMigrationMetadata.Source
    {
        get { return null; }
    }

    string IMigrationMetadata.Target
    {
        get { return "H4s..........AQA="; }
    }
}
```

Now, the Target property returns the hash value instead of getting it from the resource manager. To do that, you should open the .resx file and copy the hash here. Then we were able to run the migrator tool.

## Model Binding Changes

ASP.NET Core has a model binding approach similar to the previous version's, but with some differences. For example, see the action below:

```csharp
public JsonResult SwitchToLinkedAccount(SwitchToLinkedAccountModel model)
{
    //....
}
```

This model binding will not work if your client sends the request with a JSON body. We should explicitly add the [FromBody] attribute to the parameter:

```csharp
public JsonResult SwitchToLinkedAccount([FromBody] SwitchToLinkedAccountModel model)
{
    //....
}
```

Read the [model binding document](https://web.archive.org/web/20180412124653/https://docs.asp.net/en/latest/mvc/models/model-binding.html) to fully understand the new model binding approach.

## wwwroot Folder

Now, we have a wwwroot folder, which is the root folder for our static files (like the js, css and image files of our web application). I moved all such files into this new folder.

## Client Side Dependencies

We mostly used NuGet for js/css libraries in the previous ASP.NET, and we couldn't find all libraries and their latest versions on NuGet. NuGet also has other problems with client libraries containing static files. Fortunately, bower is now fully integrated into Visual Studio 2015. So, I added all libraries with bower.
One more good thing: NuGet no longer adds package files into the solution; it uses a central directory on the local computer instead, which is much more performant and does not add unnecessary files into our solution folder.

## Startup File

An ASP.NET Core application is initialized from the **Startup** class. We configure all libraries (including ABP) in this class. So, I removed the old **global.asax** and configured the fundamental libraries in the Startup class.

## Social Logins

Since the authentication infrastructure has completely changed, we should not use the OWIN based social login middlewares anymore. Instead, I moved my code to the new social auth packages based on [this documentation](https://web.archive.org/web/20180412124653/https://docs.asp.net/en/latest/security/authentication/sociallogins.html).

## SignalR & OWIN Integration

As you probably know, there is no SignalR version for ASP.NET Core yet (see the [road map](https://web.archive.org/web/20180412124653/https://github.com/aspnet/Home/wiki/Roadmap)). The current SignalR (2.x) works on OWIN. While ASP.NET Core has [OWIN support](https://web.archive.org/web/20180412124653/https://docs.asp.net/en/latest/fundamentals/owin.html), it was not easy to properly integrate SignalR into the ASP.NET Core pipeline. Fortunately, I did it.

The first step is to properly integrate the OWIN pipeline into the ASP.NET Core middleware pipeline.
I created an extension method based on [some code](https://web.archive.org/web/20180412124653/https://code.msdn.microsoft.com/The-ASPNET-vNext-Real-Time-b1d27fe4/sourcecode?fileId=155043&pathId=1111143726) I found:

```csharp
public static class BuilderExtensions
{
    public static IApplicationBuilder UseAppBuilder(
        this IApplicationBuilder app,
        Action<IAppBuilder> configure)
    {
        app.UseOwin(addToPipeline =>
        {
            addToPipeline(next =>
            {
                var appBuilder = new AppBuilder();
                appBuilder.Properties["builder.DefaultApp"] = next;

                configure(appBuilder);

                return appBuilder.Build<Func<IDictionary<string, object>, Task>>();
            });
        });

        return app;
    }
}
```

Then I can use the MapSignalR method to add SignalR to the pipeline as shown below:

```csharp
app.UseAppBuilder(appBuilder =>
{
    appBuilder.Properties["host.AppName"] = "MyProjectName";
    appBuilder.MapSignalR();
});
```

## User Secrets

ASP.NET Core has a nice tool to store sensitive configuration values outside of the solution folder. Thus, our secret or custom configuration values (like passwords) remain on our local computer instead of going into the source control system. I used it to store Social Login API keys, for example.

## Token Based Authentication

In the previous ASP.NET, it was relatively easy to add a token based auth mechanism to our applications, since it included the infrastructure to generate and validate tokens. In ASP.NET Core, there is no mechanism to generate tokens (though there is a package to validate them). So, I investigated a lot, tried different OpenId Connect servers ([ASOS](https://web.archive.org/web/20180412124653/https://github.com/aspnet-contrib/AspNet.Security.OpenIdConnect.Server), [Identity Server 4](https://web.archive.org/web/20180412124653/https://identityserver4.readthedocs.io/)…) and finally decided to build my own custom and simple token generator middleware based on [this article](https://web.archive.org/web/20180412124653/https://stormpath.com/blog/token-authentication-asp-net-core).
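For the validation side, the JWT bearer package can be wired up in the pipeline roughly as follows. This is a hedged sketch (the issuer, audience and signing key values are made up, and the exact API differs between ASP.NET Core versions), not the project's actual configuration:

```csharp
// Hypothetical sketch: validating incoming JWTs with the
// Microsoft.AspNetCore.Authentication.JwtBearer package (ASP.NET Core 1.x style API).
// All values below are illustrative, not from the actual project.
app.UseJwtBearerAuthentication(new JwtBearerOptions
{
    TokenValidationParameters = new TokenValidationParameters
    {
        ValidIssuer = "MyIssuer",
        ValidAudience = "MyAudience",
        IssuerSigningKey = new SymmetricSecurityKey(
            Encoding.ASCII.GetBytes("my-secret-signing-key"))
    }
});
```

Generating the tokens themselves still requires custom code, as described above.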
## Swagger & Swagger UI

We previously used [Swashbuckle](https://web.archive.org/web/20180412124653/https://github.com/domaindrivendev/Swashbuckle) to add the Swagger API document generator to our application. Fortunately, they created [a version](https://web.archive.org/web/20180412124653/https://github.com/domaindrivendev/Ahoy) compatible with ASP.NET Core, and we easily switched to that new version (which is currently in beta).
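For illustration, wiring that package up in the Startup class looks roughly like the sketch below; the exact method names may differ between the beta and later releases, so treat this as an assumption rather than the project's actual code:

```csharp
// Hypothetical sketch of enabling Swagger in the Startup class.
public void ConfigureServices(IServiceCollection services)
{
    services.AddSwaggerGen(); // registers the Swagger document generator
}

public void Configure(IApplicationBuilder app)
{
    app.UseSwagger();   // serves the generated swagger.json document
    app.UseSwaggerUi(); // serves the interactive Swagger UI
}
```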
# Introduction

Creating a modular application is hard. **Building a modular User Interface is even harder**. You need to develop module pages and components separately, yet make them integrated and working together just like a monolithic application UI.

Creating such a modular architecture requires building a strong **infrastructure**. This is what we are trying to do with the open source [ABP framework](https://abp.io/) project. In this article, I will focus on the **Virtual File System**, an important part of the modular infrastructure, and will explain why we need it and how it can be developed on top of ASP.NET Core MVC.

<center><iframe width="560" height="315" src="https://www.youtube.com/embed/2cTHXC7wI6w" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></center>

<center> Video presentation of this article </center>

# User Interface Components

A typical ASP.NET Core MVC web application UI consists of static and dynamic resources.

**Static resources** include JavaScript, CSS, image… files. These resources are requested by the browser and served by the **Static Files** middleware. They are generally located under the wwwroot folder of the application.

**Dynamic resources** are Razor views, pages and components. They are handled, compiled and rendered by the **Razor View Engine**.

Both static and dynamic files are normally located in the **physical file system** (while the latest ASP.NET Core has a pre-compile option, the main point is the same).

# User Interface Components in a Modular Application

In a modular application, UI components are distributed into modules and generally **embedded** into module assemblies (DLL files).

The Static File Middleware and the Razor View Engine **can not** work with resources distributed across module assemblies.
# The Virtual File System

The Virtual File System is an adapter (wrapper) that makes ASP.NET Core work with resources located outside the physical file system.

Our virtual file system implementation can work with three types of file locations:

- **Embedded Files**: Files located in a DLL as embedded resources. These resources are registered to the Virtual File System on application startup.
- **Physical Files**: Files located under the web application (the wwwroot folder for static resources, the root folder for views, pages… etc.). So, it’s backward compatible.
- **Dynamic Files**: Files generated at runtime (such as dynamic js/css bundle files).

Dynamic files can **override** physical files, and physical files can **override** embedded files (if located in the same path). In this way, your application can override the UI components (like CSS files, JS files or views) of a module for customization purposes.

## Virtual File Contribution

Modules should register/add their own embedded resources to the Virtual File System on application startup. We have created VirtualFileSystemOptions for that purpose. Example usage:

```csharp
context.Services.Configure<VirtualFileSystemOptions>(options =>
{
    options
        .FileSets
        .AddEmbedded<MyModule>();
});
```

This code adds all embedded resources in the assembly of the MyModule class to the Virtual File System (VFS). Once all modules have contributed to the VFS, we have a list of files and their paths (the namespace of an embedded resource is converted to a path) in an in-memory dictionary/collection.

## IFileProvider Interface

ASP.NET Core uses the IFileProvider interface to read files from the file system:

```csharp
public interface IFileProvider
{
    IFileInfo GetFileInfo(string subpath);
    IDirectoryContents GetDirectoryContents(string subpath);
    IChangeToken Watch(string filter);
}
```

- The **GetFileInfo** method is used to read a file's info and content from a given path. It returns **NotFoundFileInfo** if the given file does not exist.
- The **GetDirectoryContents** method is used to get the list of files and directories inside a directory. It returns **NotFoundDirectoryContents** if the given directory does not exist (it can return the singleton instance: NotFoundDirectoryContents.Singleton).
- The **Watch** method is used to get notified when a file or folder changes in the given path. The filter can contain wildcards (like ‘*’).

We should implement this interface to return files from embedded/dynamic files. I will not share the implementation in this article, but you can guess it. If you want to know the details, see our [implementation](https://github.com/abpframework/abp/tree/master/framework/src/Volo.Abp.VirtualFileSystem) and the [documentation](https://github.com/abpframework/abp/blob/master/docs/en/Virtual-File-System.md).

## Configure Razor View Engine

Once we have implemented the Virtual File System, we can configure the **RazorViewEngineOptions** to add the new custom file provider:

```csharp
context.Services.Configure<RazorViewEngineOptions>(options =>
{
    options.FileProviders.Insert(0, new MyVirtualFileProvider());
});
```

## Replace Static File Middleware

We normally use **app.UseStaticFiles** to serve physical files to browsers. We should replace it with the Virtual File Provider as well. This part is also very easy. We can create an extension method like this:

```csharp
public static void UseVirtualFiles(this IApplicationBuilder app)
{
    app.UseStaticFiles(
        new StaticFileOptions
        {
            FileProvider = new MyVirtualFileProvider()
        }
    );
}
```

MyVirtualFileProvider is our example IFileProvider implementation. In your case, you can set FileProvider to any class that implements the IFileProvider interface, as explained before. Finally, we can use the UseVirtualFiles method instead of UseStaticFiles:

```csharp
app.UseVirtualFiles();
```

# Conclusion

I tried to briefly explain why we need a Virtual File System to develop modular ASP.NET Core MVC applications and how to implement it.
I plan to write more articles on modular application development with ASP.NET Core, based on my [ABP framework](https://abp.io/) development experiences.
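To make the MyVirtualFileProvider mentioned above a bit more concrete, here is a rough, simplified sketch of how such a provider could be shaped. This is my own illustration under the assumptions described in the article (an in-memory dictionary populated from embedded resources at startup), not ABP's actual implementation:

```csharp
// Simplified illustration of a virtual file provider backed by an in-memory
// dictionary built from embedded resources at startup. Not ABP's actual code.
public class MyVirtualFileProvider : IFileProvider
{
    private readonly IDictionary<string, IFileInfo> _files;

    public MyVirtualFileProvider(IDictionary<string, IFileInfo> files)
    {
        _files = files; // path -> file info, contributed by modules at startup
    }

    public IFileInfo GetFileInfo(string subpath)
    {
        return _files.TryGetValue(subpath, out var file)
            ? file
            : new NotFoundFileInfo(subpath);
    }

    public IDirectoryContents GetDirectoryContents(string subpath)
    {
        // A full implementation would enumerate the entries under subpath.
        return NotFoundDirectoryContents.Singleton;
    }

    public IChangeToken Watch(string filter)
    {
        // Embedded resources never change at runtime.
        return NullChangeToken.Singleton;
    }
}
```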
In this article, I will share my experiences and suggestions on using Dependency Injection in ASP.NET Core applications. The motivation behind these principles is:

- Effectively designing services and their dependencies.
- Preventing multi-threading issues.
- Preventing memory leaks.
- Preventing potential bugs.

This article assumes that you are already familiar with Dependency Injection and ASP.NET Core at a basic level. If not, please read the [ASP.NET Core Dependency Injection documentation](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/dependency-injection?view=aspnetcore-2.1) first.

# Basics

## Constructor Injection

Constructor injection is used to declare and obtain the dependencies of a service at **service construction** time. Example:

```csharp
public class ProductService
{
    private readonly IProductRepository _productRepository;

    public ProductService(IProductRepository productRepository)
    {
        _productRepository = productRepository;
    }

    public void Delete(int id)
    {
        _productRepository.Delete(id);
    }
}
```

ProductService injects IProductRepository as a dependency in its constructor, then uses it inside the Delete method.

**Good Practices:**

- Define **required dependencies** explicitly in the service constructor. Thus, the service can not be constructed without its dependencies.
- Assign the injected dependency to a **read-only** field/property (to prevent accidentally assigning another value to it inside a method).

## Property Injection

ASP.NET Core’s [standard dependency injection container](https://www.nuget.org/packages/Microsoft.Extensions.DependencyInjection) **does not support property injection**. But [you can use](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/dependency-injection?view=aspnetcore-2.1#replacing-the-default-services-container) another container that supports property injection.
Example:

```csharp
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;

namespace MyApp
{
    public class ProductService
    {
        public ILogger<ProductService> Logger { get; set; }

        private readonly IProductRepository _productRepository;

        public ProductService(IProductRepository productRepository)
        {
            _productRepository = productRepository;
            Logger = NullLogger<ProductService>.Instance;
        }

        public void Delete(int id)
        {
            _productRepository.Delete(id);
            Logger.LogInformation($"Deleted a product with id = {id}");
        }
    }
}
```

ProductService declares a Logger property with a **public setter**. The dependency injection container can set the Logger if it is available (registered to the DI container beforehand).

**Good Practices:**

- Use property injection **only for optional dependencies**. That means your service can work properly **without** these dependencies being provided.
- Use the [Null Object Pattern](https://en.wikipedia.org/wiki/Null_object_pattern) (as in this example) if possible. Otherwise, always check for null while using the dependency.

## Service Locator

The service locator pattern is another way of obtaining dependencies. Example:

```csharp
public class ProductService
{
    private readonly IProductRepository _productRepository;
    private readonly ILogger<ProductService> _logger;

    public ProductService(IServiceProvider serviceProvider)
    {
        _productRepository = serviceProvider
            .GetRequiredService<IProductRepository>();

        _logger = serviceProvider
            .GetService<ILogger<ProductService>>() ??
                NullLogger<ProductService>.Instance;
    }

    public void Delete(int id)
    {
        _productRepository.Delete(id);
        _logger.LogInformation($"Deleted a product with id = {id}");
    }
}
```

ProductService injects **IServiceProvider** and resolves dependencies using it. **GetRequiredService** throws an exception if the requested dependency has not been registered. **GetService**, on the other hand, just returns null in that case.
When you resolve services inside the **constructor**, they are released when the service itself is released. So, you don’t need to worry about releasing/disposing services resolved inside the constructor (just as with constructor and property injection).

**Good Practices:**

- **Do not use** the service locator pattern wherever possible (that is, whenever the service type is known at development time), because it makes the dependencies **implicit**. That means it’s not possible to see the dependencies easily while creating an instance of the service. This is especially important for **unit tests**, where you may want to **mock** some dependencies of a service.
- Resolve dependencies in the service **constructor** if possible. Resolving in a **service method** makes your application more complicated and error prone. I will cover the problems and solutions in the next sections.

## Service Lifetimes

There are [three service lifetimes](https://docs.microsoft.com/en-us/aspnet/core/fundamentals/dependency-injection?view=aspnetcore-2.1#service-lifetimes-and-registration-options) in ASP.NET Core Dependency Injection:

1. **Transient** services are created every time they are injected or requested.
2. **Scoped** services are created per scope. In a web application, every web request creates a new, separated service scope. That means scoped services are generally created once per web request.
3. **Singleton** services are created per DI container. That generally means they are created only once per application and then used for the whole application lifetime.

The DI container keeps track of all resolved services. Services are released and disposed when their lifetime ends:

- If the service has **dependencies**, they are also automatically released and disposed.
- If the service implements the **IDisposable** interface, the Dispose method is automatically called on service release.

**Good Practices:**

- Register your services as **transient** wherever possible, because it’s simple to design transient services.
You generally don’t need to care about **multi-threading** and **memory leaks**, and you know the service has a short life.

- Use the **scoped** lifetime **carefully**, since it can be tricky if you create child service scopes or use these services from a non-web application.
- Use the **singleton** lifetime carefully, since you then need to **deal** with **multi-threading** and potential **memory leak** problems.
- **Do not depend** on a transient or scoped service from a singleton service, because the transient service effectively becomes a singleton instance when a singleton service injects it, and that may cause problems if the transient service is not designed to support such a scenario. ASP.NET Core’s default DI container already throws **exceptions** in such cases.

# Resolving Services in a Method Body

In some cases, you may need to resolve another service in a method of your service. In such cases, ensure that you release the service after usage. The best way of ensuring that is to create a **service scope**. Example:

```csharp
public class PriceCalculator
{
    private readonly IServiceProvider _serviceProvider;

    public PriceCalculator(IServiceProvider serviceProvider)
    {
        _serviceProvider = serviceProvider;
    }

    public float Calculate(Product product, int count, Type taxStrategyServiceType)
    {
        using (var scope = _serviceProvider.CreateScope())
        {
            var taxStrategy = (ITaxStrategy)scope.ServiceProvider
                .GetRequiredService(taxStrategyServiceType);

            var price = product.Price * count;
            return price + taxStrategy.CalculateTax(price);
        }
    }
}
```

PriceCalculator injects **IServiceProvider** in its constructor and assigns it to a field. PriceCalculator then uses it inside the Calculate method to create a **child service scope**. It uses **scope.ServiceProvider** to resolve services, instead of the injected _serviceProvider instance. Thus, all services resolved from the scope are automatically released/disposed at the end of the **using** statement.
**Good Practices:**

- If you are resolving a service in a method body, always create a **child service scope** to ensure that the resolved services are properly released.
- If a method gets **IServiceProvider** as an argument, then you can directly resolve services from it without caring about releasing/disposing. Creating/managing the service scope is the responsibility of the code calling your method. Following this principle makes your code cleaner.
- **Do not hold a reference to a resolved service**! Otherwise, it may cause memory leaks, and you will access a **disposed service** when you use the object reference later (unless the resolved service is a singleton).

# Singleton Services

Singleton services are generally designed to keep application state. A cache is a good example of application state. Example:

```csharp
public class FileService
{
    private readonly ConcurrentDictionary<string, byte[]> _cache;

    public FileService()
    {
        _cache = new ConcurrentDictionary<string, byte[]>();
    }

    public byte[] GetFileContent(string filePath)
    {
        return _cache.GetOrAdd(filePath, _ =>
        {
            return File.ReadAllBytes(filePath);
        });
    }
}
```

FileService simply caches file contents to reduce disk reads. This service should be registered as a singleton; otherwise, caching will not work as expected.

**Good Practices:**

- If the service holds state, it should access that state in a **thread-safe** manner, because all requests concurrently use the **same instance** of the service. I used **ConcurrentDictionary** instead of Dictionary to ensure thread safety.
- **Do not use scoped or transient services** from singleton services, because transient services might not be designed to be thread safe. If you have to use them, then take care of multi-threading while using these services (use a lock, for instance).
- **Memory leaks** are generally caused by singleton services. They are not released/disposed until the **end of the application**.
So, if they instantiate classes (or inject them) but do not release/dispose them, those objects will also stay in memory until the end of the application. Ensure that you **release/dispose** them at the right time. See the *Resolving Services in a Method Body* section above.

- If you cache data (file contents in this example), you should create a mechanism to update/invalidate the cached data when the original data source changes (when a cached file changes on the disk, for this example).

# Scoped Services

The scoped lifetime **at first seems** a good candidate for storing per-web-request data, because ASP.NET Core creates a **service scope per web request**. So, if you register a service as scoped, it can be shared during a web request. Example:

```csharp
public class RequestItemsService
{
    private readonly Dictionary<string, object> _items;

    public RequestItemsService()
    {
        _items = new Dictionary<string, object>();
    }

    public void Set(string name, object value)
    {
        _items[name] = value;
    }

    public object Get(string name)
    {
        return _items[name];
    }
}
```

If you register RequestItemsService as scoped and inject it into two different services, then you can get an item that was added from another service, because they share the same RequestItemsService instance. That’s what we expect from scoped services.

But… the reality may not always be like that. If you create a **child service scope** and resolve the RequestItemsService from the child scope, then you will get a new instance of RequestItemsService, and it will not work as you expect. So, a scoped service does not always mean an instance per web request.

You may think that you would not make such an obvious mistake (resolving a scoped service inside a child scope). But this is not a mistake (it is a very regular usage), and the situation may not be so simple. If there is a big dependency graph between your services, you can not know whether anybody created a child scope and resolved a service that injects another service… that finally injects a scoped service.
**Good Practices:**

- A scoped service can be thought of as an **optimization** for when a service is injected by too many services in a web request. Thus, all of these services use a single instance of the service during the same web request.
- Scoped services don’t need to be designed as thread-safe, because they should normally be used by a single web request/thread. But… in that case, you should **not share service scopes between different threads**!
- Be careful if you design a scoped service to share data between other services in a **web request** (explained above). You can store per-web-request data inside **HttpContext** (inject IHttpContextAccessor to access it), which is the safer way of doing that. HttpContext’s lifetime is not scoped. Actually, it’s not registered to DI at all (that’s why you don’t inject it directly, but inject IHttpContextAccessor instead). The [HttpContextAccessor](https://github.com/aspnet/HttpAbstractions/blob/master/src/Microsoft.AspNetCore.Http/HttpContextAccessor.cs) implementation uses AsyncLocal to share the same HttpContext during a web request.

# Conclusion

Dependency injection seems simple to use at first, but there are potential multi-threading and memory leak problems if you don’t follow some strict principles. I shared some good principles based on my own experiences during the development of the [ASP.NET Boilerplate](https://aspnetboilerplate.com/) framework.
Event date: 14th — 18th May 2018

Location: Barbican Centre | LONDON

The [Software Design & Developer (SDD) Conference](http://sddconf.com/) is one of the best-known developer conferences, held at the Barbican Centre in London. The conference consists of two days of workshops and three days of in-depth technical sessions, with plenty of refreshments and snacks to keep you going, and a great lunch.

We exhibited at the SDD Conference and had the chance to speak with developers from all around the world. Most of the developers were from the .NET stack, so it was comfortable for us to talk about AspNet Boilerplate and AspNet Zero. We spoke about the benefits of the open-source AspNet Boilerplate and the production-ready AspNet Zero product. Some folks already knew us, and we introduced them to the new features and our upcoming plans. Others heard about the products for the first time and were very excited to start a new project with them.

About 95% of the attendees were developers, and developers like gadgets. We handed out hundreds of free AspNet Zero branded Wi-Fi headphones and USB flash disks, which made a good impression. Besides, we brought some really delicious handmade chocolates that some people even asked the brand of.

We had some free time to attend a few of the sessions. Each session was 1.5 hours long, so the presenters didn’t end up cutting their presentations short! It was very hard to decide which sessions to sit in on and which to skip. The talks of Scott Allen, Jimmy Bogard, Dino Esposito and Jon Skeet were especially cool, and the rest of the 40 world-class speakers were also very good and very educational.

For the details of the SDD Conference, visit [their website](https://sddconf.com/). We look forward to seeing you at the next conference. Stop by our booth and share your ideas with our lead developers.

Thanks to all speakers and attendees for making it such an enjoyable week!
[Click here](https://www.flickr.com/photos/124038428@N04/sets/72157691340139900) for the complete conference photos.
> **Web accessibility** refers to the inclusive practice of removing barriers that prevent interaction with, or access to, websites by people with disabilities. When sites are correctly designed, developed and edited, all users have equal access to information and functionality.

## Semantic HTML

Use content-appropriate semantic HTML tags. For example, use the `button` tag for button functionality:

```html
<!-- don't -->
<div>Click me!</div>

<!-- do -->
<button>Click me</button>
```

Use the `nav` element for menus instead of `div`:

```html
<!-- don't -->
<div class="menu">
  <ul> ... </ul>
</div>

<!-- do -->
<nav class="menu">
  <ul> ... </ul>
</nav>
```

Use a good content structure of headings, paragraphs, lists, etc.

```html
<h1>My heading</h1>

<p>This is the first section of my document.</p>

<p>I'll add another paragraph here too.</p>

<ol>
  <li>Here is</li>
  <li>a list for</li>
  <li>you to read</li>
</ol>

<h2>My subheading</h2>

<p>This is the first subsection of my document. I'd love people to be able to find this content!</p>

<h2>My 2nd subheading</h2>

<p>This is the second subsection of my content. I think it is more interesting than the last one.</p>
```

A heading is for headings, not for making text bigger or bold. You should also use heading tags in a certain order (h1 > h2 > h3 > …).

## Clear/Good Language

Don’t use dashes:

```html
<!-- don't -->
<button>print 1-5 pages</button>

<!-- do -->
<button>print pages from 1 to 5</button>
```

Don’t use abbreviations:

```html
<!-- don't -->
<p>Jan</p>

<!-- do -->
<p>January</p>
```

Expand acronyms, at least once or twice. Instead of writing **HTML** in the first instance, write **Hypertext Markup Language**.

## Declare the Page Language

Declare the language in the `html` markup like the following:

```html
<html lang="en">
…
</html>
```

You can use the `lang` attribute if you switch languages within the document.
```html
<p>These words <b lang="tr">bu alan Türkçe kelimeler içeriyor</b> are written in Turkish</p>
```

## Meaningful Links

```html
<!-- don't -->
<ul>
  <li><a href="#">Click here</a></li>
  <li><a href="#">Read more..</a></li>
  <li>Buy tickets to Mars <a href="#">here</a></li>
</ul>

<!-- do -->
<ul>
  <li><a href="#">Find out more about the HTML language</a></li>
  <li>Read more about <a href="#">how to eat healthy</a></li>
  <li><a href="#">Buy tickets to Mars here</a></li>
</ul>
```

It is also good practice to write meaningful text for the title attribute:

```html
<a href="https://www.w3schools.com/html/" title="Go to W3Schools HTML section">Visit our HTML Tutorial</a>
```

## Alternative Text for Media Tags

Alternative text is very important when media is not perceivable by users (image/video content cannot be seen by visually-impaired people, and audio content cannot be heard by hearing-impaired people).

```html
<img src="test_image.jpg" alt="Animals in Asia">
```

## ARIA Attributes

ARIA stands for **“Accessible Rich Internet Applications”**. It is a set of **attributes** that enhance the semantics of a web site or web application to help assistive technologies, such as screen readers for the blind, make sense of certain things that are not native to HTML. For example, you can use `role` and `aria-label` for a navigation menu:

```html
<nav aria-label="Mythical University">
  <ul id="menubar1" role="menubar" aria-label="Mythical University">
    <li>
      <a role="menuitem" aria-haspopup="true" aria-expanded="false" href="#" tabindex="0">
        About
      </a>
      <ul role="menu" aria-label="About">
        <li role="none">
          <a role="menuitem" href="mb-about.html#overview" tabindex="-1">
            Overview
          </a>
        </li>
        ...
```

Using the `aria-label` attribute on icon-only buttons is one of the good practices for accessibility. For example:

```html
<button><i class="flaticon-search-1" aria-label="@L("Search")"></i></button>
```

## Labeling Input Elements

Use a `for` attribute that matches the `id` of the input element.
For example:

```html
<label for="inputId"></label>
<input type="text" id="inputId" />
```

------

Resources:

- [Navigation Menubar Example](https://www.w3.org/TR/2017/NOTE-wai-aria-practices-1.1-20171214/examples/menubar/menubar-1/menubar-1.html)
- [Writing HTML with accessibility in mind](https://medium.com/alistapart/writing-html-with-accessibility-in-mind-a62026493412)
- [HTML: A good basis for accessibility](https://developer.mozilla.org/en-US/docs/Learn/Accessibility/HTML)
- [HTML Accessibility](https://www.w3schools.com/html/html_accessibility.asp)
- [Web accessibility](https://en.wikipedia.org/wiki/Web_accessibility)
Recently I updated all the packages in my Xamarin project, and suddenly the project stopped building. So I read a bunch of docs on the internet and saw that Xamarin has switched to the new *.csproj format. This change came with Visual Studio 2017, but Xamarin was late to adopt it.

I searched for ways of switching to the new format and found a couple of approaches. I saw a [StackOverflow post](https://stackoverflow.com/questions/42659684/how-to-upgrade-csproj-files-with-vs2017) on how to do this manually. Here’s the accepted answer.

I was looking for an automatic way of doing this, so I did not try the solution above. Then I came across this tool, which looks like a promising solution:

[hvanbakel/CsprojToVs2017 - Tooling for converting pre 2017 projects to the new Visual Studio 2017 format](https://github.com/hvanbakel/CsprojToVs2017)

I did not try that solution either, because I found another amazing tool that automatically converts projects using packages.config or project.json to PackageReference. It’s made by a Microsoft employee. Click the link below to install the Visual Studio extension:

[https://marketplace.visualstudio.com/items?itemName=TaylorSouthwickMSFT.NuGetPackagetoProjectjsonConverter](https://marketplace.visualstudio.com/items?itemName=TaylorSouthwickMSFT.NuGetPackagetoProjectjsonConverter)

After you install the extension, open your solution, right click the solution in `Solution Explorer` and click `Upgrade to Package References`.

<center> <img src="https://miro.medium.com/max/704/1*e8E9Pgjg11WvQimcqoDsRg.png"> </center>

After selecting that, the project will be transformed as shown below. It is highly recommended that you perform this in a source-controlled directory, so you can easily undo the changes if something goes wrong.
<center> <img src="https://miro.medium.com/max/950/1*33fGDjviIFOLNcBv7Pm0-Q.png"> </center>

<center> Progress window while upgrading </center>

<center> <img src="https://miro.medium.com/max/864/1*UP6vCKQCx2DhxpT-ZrViIw.png"> </center>

<center> Conversion complete window </center>

Thanks to Taylor Southwick for this neat and straightforward tool!
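For context, the SDK-style format these tools produce replaces packages.config entries with PackageReference items directly inside the project file. The following is only an illustrative sketch (the target framework, package name and version are placeholders, not values from my project):

```xml
<!-- Illustrative SDK-style project file; names and versions are placeholders. -->
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <!-- Replaces packages.config: NuGet packages are referenced right here. -->
    <PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
  </ItemGroup>

</Project>
```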
Sometimes it turns out that we have to delete all **bin** and **obj** folders recursively in a Visual Studio solution. There’s a built-in Visual Studio feature called *Clean Solution*, but it doesn’t delete them all.

This is my way of deleting all **BIN** and **OBJ** folders recursively:

1. Create an empty file and name it *DeleteBinObjFolders.bat*.
2. Copy-paste the code below into *DeleteBinObjFolders.bat*.
3. Move the *DeleteBinObjFolders.bat* file into the same folder as your solution (*.sln) file.

```bash
@echo off
@echo Deleting all BIN and OBJ folders...
for /d /r . %%d in (bin,obj) do @if exist "%%d" rd /s/q "%%d"
@echo BIN and OBJ folders successfully deleted :) Close the window.
pause > nul
```

<center> <img src="https://miro.medium.com/max/1376/1*1NSu6VL7XS58KWPdXzVYIA.jpeg"> </center>

<center> Deletes all BIN & OBJ folders with a double-click </center>

I hope you will benefit from it. Happy coding!
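If you also work on macOS or Linux (or in Git Bash), a rough equivalent of the batch file above can be written with `find`. This one-liner is my own suggestion, not part of the original tip; run it from the solution root:

```shell
# Recursively delete every bin and obj directory under the current folder.
# Unix equivalent of the batch script above.
find . -type d \( -name bin -o -name obj \) -prune -exec rm -rf {} +
```

The `-prune` keeps `find` from descending into the directories it is about to delete, and `-exec … +` batches the removals into as few `rm` invocations as possible.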
This is a quick tip that saves you from typing "npm start" every time you want to run your Angular project. I created a Regedit entry that adds a new item to the Windows context menu. When you right-click an Angular folder, you'll see a "**Run Angular**" menu item. Clicking it starts your Angular project.  # How to do this? In Windows, start **Regedit** (click Start > type Regedit) - Navigate to the path **Computer\HKEY_CLASSES_ROOT\Directory\shell\** - Right-click the **shell** folder, select **New > Key** and name it **Run Angular** - Then right-click the newly created **Run Angular** item and add **New > Key** again. Name it **command** - On the right pane, you'll see **(Default)**. Double-click it and enter: **cmd /s /k npm start -prefix "%V"** # Alternatively If you want to import this key directly, you can download the zip file below. Extract the zip file, right-click **RunAngularContextMenu.reg** and click **Merge**: [RunAngularContextMenu.zip](https://drive.google.com/open?id=1gxuzGkPiGyHIl-69WgErAWN6UvME4MaW) *PS: This context menu item is not specific to Angular, so you will see it on non-Angular folders as well.*
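For reference, the steps above correspond to a .reg file roughly like this (a sketch assembled from the steps; double-check the key names and command against your setup before merging):

```
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\Directory\shell\Run Angular]

[HKEY_CLASSES_ROOT\Directory\shell\Run Angular\command]
@="cmd /s /k npm start -prefix \"%V\""
```

Save it as a .reg file, then right-click and Merge, exactly as with the downloadable version.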
Securing a web application is crucial these days. When it comes to web development, fixing vulnerabilities should start on the first floor: with the developer. As a developer, you don't really need to know all the hustle and bustle of pen testing. There are several good tools for scanning web applications. I will show you one of the easiest ways to run a web penetration test, with the tool [OWASP ZAP](https://www.owasp.org/index.php/OWASP_Zed_Attack_Proxy_Project) (Zed Attack Proxy). # What is OWASP ZAP? OWASP (Open Web Application Security Project) is a worldwide non-profit organization focused on improving the security of software.  OWASP ZAP (Zed Attack Proxy) is one of the world's most popular security tools. It's part of the OWASP community, which means it's totally free. ## Why did I choose OWASP ZAP? It is designed to be used by people with a wide range of security experience, and as such it is ideal for developers and functional testers who are new to penetration testing. ZAP is cross-platform. What it does is create a proxy between the client and your website. While you navigate through all the features of your website, it captures all actions. Then it attacks your website with known techniques. The good part is: **yes, it's open-source!** At the time I visited their repository, the last commit was 40 minutes ago; it's a very active repository. The project was initiated in 2010 and is still being improved. See the GitHub repository: [https://github.com/zaproxy/zaproxy](https://github.com/zaproxy/zaproxy).  <center> Image 1: GitHub repository of OWASP ZAP </center> # Setting up your ZAP Environment - **Java 8+**: In order to install ZAP, you need Java 8+ on your Windows or Linux system. If you use macOS, you don't need to install Java as it's already there. Go to [https://java.com/en/download/](https://java.com/en/download/) and install it.
- **Installer:** Download the ZAP installer for your OS: [https://github.com/zaproxy/zaproxy/wiki/Downloads](https://github.com/zaproxy/zaproxy/wiki/Downloads) ## Starting OWASP ZAP After you install the application to the default directory, you can start it by clicking the OWASP ZAP icon on your Windows desktop. The default install directory is: ```bash
C:\Program Files\OWASP\Zed Attack Proxy\ZAP.exe
``` As it is a Java application, you can alternatively run the following command to start it. This gives you extra configuration options, like scheduling your penetration test or starting with a particular URL: ```bash
java -Xmx512m -jar zap-2.7.0.jar
``` When you run the app, it asks whether you want to save the session. If you want to reach your website configuration or test results later, you should save the session. For now, let's keep the default: *"No, I do not want to persist the session"*. <center> <img src="https://miro.medium.com/max/910/1*p5deaD3oyywvFx83fLqOZQ.png"> </center> <center> Image 2: Default startup dialog of OWASP ZAP </center> # What Is the Difference Between Active & Passive Scan? ## What is passive scan? In terms of a penetration test, a passive scan is a harmless test that only looks at responses and checks them against known vulnerabilities. A passive scan doesn't modify your website's data, so it's safe for websites we don't have permission to test. As you know, OWASP's number 1 vulnerability in 2018 is still **Injection**, and be aware that you cannot detect even a SQL injection with a passive scan. ## What is active scan? An active scan attacks the website using known techniques to find vulnerabilities. An active scan does modify data and can insert malicious scripts into the website. So when you really test your website for security issues, deploy it to a new environment and run the active scan there. And only run active scans on sites you have permission to test!
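As a side note, ZAP can also run without the UI. A headless "quick scan" that spiders and attacks a URL and then writes an HTML report can be launched roughly like this (a sketch; check `zap.sh -help` for the exact flags in your version, and note the target URL is illustrative):

```bash
# Headless quick scan: spider + active scan, then write an HTML report.
# zap.sh is the launcher in the ZAP install directory (zap.bat on Windows).
./zap.sh -cmd -quickurl http://localhost:8080 -quickout ./zap-report.html -quickprogress
```

This is handy for scheduled scans on a build server, but the same permission warning applies.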
<center> Active Scan </center> # Introduction to ZAP UI Let's have a brief look at the ZAP UI layout to understand the basics. On the following screen I numbered four sections.  <center> Image 3: OWASP ZAP UI features </center> **1 — Modes**: On the upper left of the screen you see the modes. There are 4 modes: - **Standard Mode:** Allows you to do anything to any website. - **Attack Mode:** Active-scans any website. - **Safe Mode:** Turns off all the harmful features while scanning. - **Protected Mode:** Allows you to scan websites in a particular scope. It prevents you from scanning unwanted websites. **2 — Sites:** All the sites you access via the ZAP proxy are listed here. If your website makes a request to another website, you'll see that under a separate site. **2.1 — Show Only URLs in Scope:** You should toggle this option on, because the sites section gets ugly after a few tests. To focus on your target website, create a new context for it and keep the *In Scope* option checked. By doing this you will no longer see other websites you are not interested in.  <center> Image 4 </center> 3 — **Workspace Window:** The workspace window consists of 3 tabs: 3.1 — **Quick Start Window:** This is the most direct and fastest way to start an active scan. Enter the target website address in the *URL to attack* input and hit the *Attack* button. *It first crawls the website, then performs an active scan.* 3.2 — **Request & Response Windows:** These are the most used parts of the UI. The request tab is divided into 2 parts: the upper shows the request's headers and cookies, and the bottom shows the POST parameters as they are sent to the server. The response window is similar: it shows the response header and body.  <center> Image 5: An example request & response </center> 4 — **Bottom Window:** This shows the results, the request history and the vulnerabilities found. The most important tab here is the *Alerts* tab.
4.1 — **Alerts Tab:** It shows the vulnerabilities found on the target website. When you click one of the alerts in the list (1), it opens the related request/response on the upper right (2) and gives neat information about the vulnerability. Let's see what happened on the request tab in Image 5: a POST request was made to [http://localhost:22742/api/TokenAuth/Authenticate](http://localhost:22742/api/TokenAuth/Authenticate). So a user is signing in with credentials, and the server returns an HTTP 500 error. ```
HTTP-500 Internal Server Error.
``` OWASP ZAP thinks this is a problem, and in the 3rd window you see some information about it. It shows the exact URL and a yellow flag (medium risk). The description says: *This page contains an error/warning message that may disclose sensitive information like the location of the file that produced the unhandled exception. This information can be used to launch further attacks against the web application. The alert could be a false positive if the error message is found inside a documentation page.* This is cool, because OWASP ZAP smells an information leak: it suspects the website throws an unhandled exception. This can be a real vulnerability when a website shows an exception stack trace to the attacker and gives away information about the environment or code. But here in our example, the response is JSON content that says "Invalid user name or password"; the developer simply preferred to send it with HTTP 500. It is a false-positive alert because no information is being exposed.  <center> Image 6: ASP.NET yellow error page exposing information </center> The solution section gives information about how to overcome the issue. > Review the source code of this page. Implement custom error pages. Consider implementing a mechanism to provide a unique error reference/identifier to the client (browser) while logging the details on the server side and not exposing them to the user.
<center> Image 7: Alert window </center> # Proxying Your Website: JxBrowser In earlier versions of OWASP ZAP, you had to configure your browser's proxy to capture requests. But there's a cool new feature: [JxBrowser](https://www.teamdev.com/jxbrowser)! This is a Chromium-based browser integrated into OWASP ZAP. It comes with all the proxy configuration set up, routing all traffic through OWASP ZAP. Hit *Launch Browser* and navigate to your website.  <center> Image 8: Start JxBrowser </center> # Navigating Your Website In order to extract the tree of your website, you need to crawl the website in JxBrowser. You should hit all the features and go through all possible actions. This phase is very important! **> The more you explore your website, the more efficient the results you get.** <center> <img src="https://miro.medium.com/max/640/1*zfurDlzUpxvYjplxuu0D4g.gif"> </center> <center> Navigate your website </center> # Spidering Your Website Spidering a website means crawling all the links and getting the structure of the website. ## **Why do we need spidering?** If you access all aspects of the site while navigating your website, then strictly speaking you don't need to use the spider; it's there to pick up things you missed, or for when proxying isn't an option. Spidering is done by right-clicking the site, selecting *Attack* from the menu, then clicking *Spider*. Make sure the *recurse* option is checked! Press the *Start Scan* button. It will take some time depending on the number of links on your website. # How to pentest a SPA (Single Page Application) Website? If it's a SPA website, then you need to tell ZAP more; in particular, which parameters represent application structure rather than application data.
To do this: - Double-click your context (in our test it's a modern [AspNet Zero](https://aspnetzero.com/) SPA) - Select the 'Structure' tab - Add the 'structural' parameter(s)  <center> Image 9: Spidering your website </center> > If you cover all the features and actions of your SPA website, then you don't need to spider! # Extensions There's an extension marketplace maintained by the community. Click the *3 colored boxes* icon to bring up the list. To install an extension, click the Marketplace tab and type the extension name in the box, then click the *Install Selected* button. That's it! No need to restart. There are useful extensions; some I can suggest: - Active Scanner rules - Passive Scanner rules - FuzzDB  <center> Image 10: Extensions window </center> # Configure Scan Policy Before scanning, I recommend setting the scan policy as shown below. From the *Analyse* menu, select *Scan Policy Manager* and click the *Modify* button. In the *Scan Policy* window, set *Low => Threshold To All* and click the *Go* button. Likewise, set *Insane => Strength To All* and click *Go*. Click *OK* to save. This enables all the attacks and makes the scan thorough.  # Start Attacking Attacking the target website is very straightforward. <center> <img src="https://miro.medium.com/max/400/1*eYYTQ4zmFD6l4jNLLwxbAA.gif"> </center> 1 — Add your website to the *context*. To do this, right-click the target website in the left pane, choose *Include in Context* and select *Default Context*. You can create a new context as well. You'll see a new website URL in the pop-up window, which adds your website as a regular expression; the asterisk (*) in the URL means all URLs under this website will be attacked. Before attacking, you can go through the other options in the *Default Context* to fine-tune your settings. Finally, click the *OK* button.  <center> Image 11: Include your website in context </center> 2 — Show only the URLs in the current scope.
By doing this you hide the other websites and prevent accidental attacks. <center> <img src="https://miro.medium.com/max/836/1*EUouLjoVduUVckhPaf2Vuw.png"> </center> <center> Image 12: Show only URLs in the scope </center> 3 — Run a spider scan to traverse all paths in the website.  <center> Image 13: Start spidering </center> 4 — Attack! This is the main goal: we'll start an *Active Scan*. An active scan can insert harmful data into your database, so run it only on websites you are allowed to test. When you click *Start Scan*, it starts a process which can be time-consuming depending on the URL count.  <center> Image 14: Start active scan </center> > Up to here, we have almost finished pentesting. The following are nice-to-have options: # Fuzzer Fuzzing is sending unexpected or random data to the inputs of a website. Normally we validate inputs on the client side, which is why some problems in the back end get overlooked. When you fuzz **key inputs** (like the main search input of the website, or the login page inputs) you can uncover coding errors and security loopholes. This is an optional security step. If you want to run the *Fuzzer*, locate the request you want to fuzz in the left pane. Right-click it and choose *Attack*, then click *Fuzz*. In the *Fuzzer* window, you'll see the request POST data. Click on the POST data and highlight the text you want to attack. On the right pane, click the *Add* button; you'll see the *Payloads* window. Click *Add* again. In the *Add Payload* window, choose *File Fuzzers* from the type combo box and select the file you want to use. This file is a database that will be used to brute-force the input. When it finishes, the results are listed in the bottom tab called *Fuzzer*. The ones tagged *Fuzzed* are suspicious and need to be taken care of.  <center> Image 15: Fuzzer </center> # Forced Browse The spider looks for known, organically linked URLs. But the spider cannot find a URL that's not mentioned anywhere on the website.
That's where forced browsing comes in. Forced browsing uses brute-force dictionaries to check if there are any other hidden URLs, like an admin panel or something else that can be hacked.  <center> Image 16: Forced browse </center> # Break Break is a very good function for intercepting and modifying requests and responses. If you want to change any particular request's POST data or response data, right-click the site, choose Break, and in the Add Break Point window click Save. Now, on the bottom pane, you'll see the breakpoint is enabled. From now on, all requests will be intercepted by OWASP ZAP. Whenever you make a request from the original website, the ZAP window comes to the front and allows you to modify the request. After you press the green play button on the toolbar, ZAP brings you the response sent by the server, and you can modify the response as well, so your browser will receive the altered response.  <center> Image 17: Breakpoints </center> # Results & Report You've done a good job so far: you scanned your website for known vulnerabilities. But without reporting those issues properly, the job is not complete. <center> <img src="https://miro.medium.com/max/400/1*ZazRJHflEMVs20fy-cL7wg.gif"> </center> <center> Reporting vulnerabilities </center> You can see the issues on the *Alerts* tab in the bottom pane. In the following screen, there are 5 alerts with colorized flags. If you have no red flag, you are lucky! If you do have red flags, focus on them first and fix them ASAP. <center> <img src="https://miro.medium.com/max/356/1*fu5fr-4zQRcvsssxnjAu5A.png"> </center> <center> Image 18: The flag colors show the risk level. </center> > If you don't see any alerts, then you might have done something wrong! When you click one of the alerts, it shows the related request & response window. There's a nice reporting tool that generates a neat report file automatically. You can export reports as HTML, XML, JSON, Markdown… I generated an HTML report.
You can see it's a well-organized final report that you can send to any colleague as is. <center> <img src="https://miro.medium.com/max/978/1*I-93poX8ipmmcl7mqPdBwQ.png"> </center> <center> Image 19: Generating the report </center>  <center> Image 20: HTML report </center> <center> <img src="https://miro.medium.com/max/400/1*J1RBAgYFFMp4wgJ0731-XQ.gif"> </center> I hope you enjoyed the article! Secure coding ;)
In this article, I will try to show you how to integrate Angular Universal into [ASP.NET Boilerplate](https://aspnetboilerplate.com/)'s ASP.NET Core & Angular template (https://github.com/aspnetboilerplate/module-zero-core-template). Then, we will build a basic application called Hero Shop. This application will display a list of heroes and give you a starting point for your ASP.NET Core and Angular Universal application. ## First Words I think Single Page Application frameworks are really good for developing back-office applications nowadays. SPAs provide a better developer and user experience. We are also using [Angular](https://angular.io/) in both [ABP Framework](https://aspnetboilerplate.com/)'s free templates and [AspNet Zero](https://aspnetzero.com/). SPA frameworks are not considered a good solution for public websites because of some well-known reasons, like long initial app load time and SEO optimization. Angular provides universal rendering in order to overcome these technical hassles. More detailed information can be found at [https://universal.angular.io/](https://universal.angular.io/). ## Create the Project Let's start by downloading a new template from [https://aspnetboilerplate.com/Templates](https://aspnetboilerplate.com/Templates). Name your project **Acme.HeroShop**.  When you press the "Create my project" button, you will get a zip file containing the Visual Studio solution. You can run your application according to this document: [https://aspnetboilerplate.com/Pages/Documents/Zero/Startup-Template-Angular](https://aspnetboilerplate.com/Pages/Documents/Zero/Startup-Template-Angular). ## Add the Public Project Add a new project called "**Acme.HeroShop.Web.Public**" to your "**Acme.HeroShop.sln**" solution. Select Empty project in the next step; we will configure everything from scratch 😉.  ## Create angular-cli app It's time to create your Angular app. Run the "**ng new HeroApp**" command in the root folder of your "**Acme.HeroShop.Web.Public**" project.
You need **angular-cli** installed globally in order to run this command. If you haven't installed it before, installation is explained on its own GitHub page: [https://github.com/angular/angular-cli#installation](https://github.com/angular/angular-cli#installation). For this article, we used angular-cli v1.6.5. > *The latest version of Angular is v5.x at the moment, but some of the npm packages we are using don't support v5.x for universal rendering, so you need to downgrade the Angular npm packages to v4.x. Just open package.json and change the "@angular/\*" package versions to "4.4.6".* Then, move **.angular-cli.json** and **package.json** to the root folder of the **Acme.HeroShop.Web.Public** project. This will allow you to run your commands in the root directory of the **Acme.HeroShop.Web.Public** project. Change the value of the root property in .angular-cli.json to "HeroApp/src". Delete the **node_modules** folder in HeroApp, then run "**npm install**" and "**ng serve**" in the root folder to see if the initial angular-cli app is working. To check, browse [http://localhost:4200/](http://localhost:4200/); you will get the screen below if you succeed.  > *We prefer using* [*Yarn*](https://yarnpkg.com/) *to install npm dependencies; Yarn is less error-prone. If you want to use Yarn, just install it by following the instructions on* [*https://yarnpkg.com/*](https://yarnpkg.com/)*.* ## Configure Angular app for universal rendering In .angular-cli.json, there is an app definition in the **apps: []** array. You will create another app for your sample; the original app definition is for the default, non-universal usage. Here is the new app definition.
<script src="https://gist.github.com/ismcagdas/091c17ee2ed661368a7bebed91675ff3.js"></script> Now, you can add the following packages for universal rendering: ```
@nguniversal/aspnetcore-engine
@angular/platform-server
aspnet-prerendering
``` Run the below commands to install these npm packages and save them into the package.json file. ```bash
npm install @nguniversal/aspnetcore-engine --save
npm install aspnet-prerendering --save
npm install @angular/platform-server --save
``` Copy the "modules" folder from this link ([https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/AngularUniversal/aspnet-core/src/Acme.HeroShop.Web.Public/HeroApp/src/modules](https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/AngularUniversal/aspnet-core/src/Acme.HeroShop.Web.Public/HeroApp/src/modules)) to the "HeroApp\src\" folder of your angular-cli app. These files are used in your **app.server.module.ts** (we will create it later) to transfer data from server to client. Open your **app.module.ts** and change its content as below: <script src="https://gist.github.com/ismcagdas/e4161800c514099782215ad042e77aa7.js"></script> Then, you need to create a server module. Create a file called "**app.server.module.ts**" under "HeroApp\src\app" and set its content to: <script src="https://gist.github.com/ismcagdas/546ce42ed7e6f7e7c96633fe6312b327.js"></script> Create a file called "**tsconfig.server.json**" under "HeroApp\src" and change its content to: <script src="https://gist.github.com/ismcagdas/af563cebdf42758640181cfd6e63d6d4.js"></script> As you can see, **AppServerModule** is used as the entry module instead of **AppModule** for server-side rendering. Let's create **main.server.ts** instead of using **main.ts** for server-side rendering. **main.server.ts** must be in the same folder as **main.ts**, which is under "HeroApp\src".
Here is your new **main.server.ts** file: <script src="https://gist.github.com/ismcagdas/0cb446ec3c165b52579a5a4a165b7baf.js"></script> Create **polyfills.server.ts** and **polyfills.client.ts** under "HeroApp\src". **polyfills.server.ts:** ```ts
import './polyfills';
import 'zone.js';
``` **polyfills.client.ts:** ```ts
import 'zone.js/dist/zone';
``` Since **zone.js** is imported manually in **main.server.ts**, you need to comment out the zone.js import in **polyfills.ts**, otherwise you will get a runtime error. Now your angular-cli app is ready for server-side rendering, and you can start working on the ASP.NET Core application. ## Preparing ASP.NET Core app Start by disabling TypeScript building in your Public project: add `<TypeScriptCompileBlocked>true</TypeScriptCompileBlocked>` between the `<PropertyGroup>...</PropertyGroup>` tags, because you are not building TypeScript with Visual Studio. Now your project should build in Visual Studio. Add an ABP NuGet package reference to your public website: add `<PackageReference Include="Abp" Version="3.3.0" />` to **Acme.HeroShop.Web.Public.csproj** and let Visual Studio restore the NuGet packages. Also add a project reference to **Acme.HeroShop.Web.Core**; it provides a base controller and some other features. Then, you need to create an ABP module called "**HeroShopPublicModule**". <script src="https://gist.github.com/ismcagdas/689e7a2e06d62ecb1437478ae0f2ad96.js"></script> You also need to modify **Startup.cs** and add a **HomeController** to run your public website. Here is the Startup.cs file: <script src="https://gist.github.com/ismcagdas/5107bbd82180934b280bfd98f0649d5b.js"></script> The **AddNodeServices** middleware allows processing the angular-cli app on the server side. Create a new folder called "**Controllers**" and add an empty **HomeController** inside it.
```csharp
public class HomeController : Controller
{
    public IActionResult Index()
    {
        return View();
    }
}
``` Add a view called **Index.cshtml** for the HomeController's Index action with the below content: ```html
@{
    ViewData["Title"] = "Index";
}
<h2>Index</h2>
``` Your ASP.NET Core app should be up and ready at this stage. In order to transfer your server-side rendered app to the client, you need two basic model classes. Create a folder called "Models" at the same level as the "Controllers" folder and place the below classes inside it. **SimplifiedRequest.cs:** ```csharp
public class SimplifiedRequest
{
    public object cookies { get; set; }
    public object headers { get; set; }
    public object host { get; set; }
}
``` **TransferData.cs:** ```csharp
public class TransferData
{
    public dynamic request { get; set; }
    public object thisCameFromDotNET { get; set; }
}
``` Now, you can modify HomeController to return the server-side rendered Angular app. Here is the updated HomeController.cs: <script src="https://gist.github.com/ismcagdas/dfdeba1431f4ac3bf2ed268311a48bbe.js"></script> You need to modify Index.cshtml to display the server-side rendered Angular app. The rendered HTML is stored in ViewData["SpaHtml"], so you can easily write it out in Index.cshtml: ```html
@Html.Raw(ViewData["SpaHtml"])
``` Run the below command from a command prompt in the root directory of the **Acme.HeroShop.Web.Public** project to publish your second app ("**app 1**" refers to the second app in .angular-cli.json): ```bash
ng build --prod --app 1 --output-hashing=false
``` This command publishes your app to the "**HeroApp\dist-server**" folder. Then, press F5 to run the ASP.NET Core app; it should display the server-side rendered app. There is nothing new on the UI because we haven't done anything for the UI yet; the content of the page is served by ASP.NET Core. Here is the response of the Index action:  ## Creating Hero Entities You have managed to set up your server-side rendered Angular app.
Now, you can start creating the entities. You will have two simple entities: Hero and HeroCompany. Add a folder called "**Heroes**" to the **Acme.HeroShop.Core** project and add the below classes into this folder. **HeroCompany.cs:** ```csharp
public class HeroCompany : Entity
{
    public string Name { get; set; }
}
``` **Hero.cs:** ```csharp
public class Hero : Entity
{
    public string Name { get; set; }
    public int HeroCompanyId { get; set; }
}
``` Add your new entities to the **HeroShopDbContext**: ```csharp
public virtual DbSet<Hero> Heroes { get; set; }
public virtual DbSet<HeroCompany> HeroCompanies { get; set; }
``` After that, add a migration using Visual Studio's Package Manager Console by running: ```powershell
Add-Migration "Added_Hero_Entities"
``` Then, run the below command to update your database: ```powershell
Update-Database
``` Let's create initial hero data in **SeedHelper.cs**. Create a new class called "**InitialHeroBuilder.cs**" and use it as below in the **SeedHostDb** method of **SeedHelper.cs**: ```csharp
new InitialHeroBuilder(context).Create();
``` Here is **InitialHeroBuilder.cs:** <script src="https://gist.github.com/ismcagdas/64a624bdf96df9e85037a9dab3baf7a2.js"></script> ## Creating HeroAppService You will need an API to get the list of heroes and hero companies from the HomeController. Let's create the app services and related DTOs. Here is the **HeroDto:** ```csharp
[AutoMapFrom(typeof(Hero))]
public class HeroDto : EntityDto
{
    public string Name { get; set; }
    public int HeroCompanyId { get; set; }
}
``` and here is the **HeroCompanyDto:** ```csharp
[AutoMapFrom(typeof(HeroCompany))]
public class HeroCompanyDto : EntityDto
{
    public string Name { get; set; }
}
``` This is the **IHeroAppService** interface: ```csharp
public interface IHeroAppService : IApplicationService
{
    List<HeroDto> GetHeroes(int? heroCompanyId);
    List<HeroCompanyDto> GetHeroCompanies();
}
``` To retrieve heroes, create a new method called **GetHeroes**, and to retrieve hero companies create the **GetHeroCompanies** method. The implementation of **HeroAppService** is simple, thanks to the ABP Framework and its generic repository implementation 😄. <script src="https://gist.github.com/ismcagdas/191c8986d69476a072a3d217a75921fb.js"></script> ## Building the UI Before displaying heroes on the client, let's add Bootstrap and jQuery to the Angular app. To install Bootstrap, run the below command in the root directory of the **Acme.HeroShop.Web.Public** project: ```bash
npm install bootstrap --save
``` Then, add Bootstrap's and jQuery's CSS/JavaScript files to the styles and scripts arrays of the second app in .angular-cli.json. It should look like this: ```json
"styles": [
  "./assets/bootstrap.min.css",
  "styles.css"
],
"scripts": [
  "../../node_modules/jquery/dist/jquery.min.js",
  "../../node_modules/bootstrap/dist/js/bootstrap.min.js"
]
``` *I had a problem with bundling **bootstrap.min.css** from the **node_modules** folder while writing this article; there was something wrong with Bootstrap's CSS and angular-cli working together. Bootstrap v4 had just been released a few days before I wrote this article. Because of this problem, I downloaded **bootstrap.min.css** and placed it under the assets folder of **HeroApp**.* Bundled CSS and JavaScript files will be served from the "**HeroApp/dist-server**" folder. To enable that, add the below code to the **Configure** method of **Startup.cs**.
```csharp
app.UseStaticFiles(new StaticFileOptions
{
    FileProvider = new PhysicalFileProvider(
        Path.Combine(Directory.GetCurrentDirectory(), @"HeroApp", @"dist-server")),
    RequestPath = new PathString("/HeroApp/dist-server")
});
``` Now, you can add **scripts.bundle.js** and **styles.bundle.css** to Index.cshtml: ```html
<script src="~/HeroApp/dist-server/scripts.bundle.js" asp-append-version="true"></script>
<link rel="stylesheet" href="~/HeroApp/dist-server/styles.bundle.css" asp-append-version="true" />
``` ## Listing Heroes on the UI Let's modify **app.component.ts** and **app.component.html** to display the list of heroes. First, create a few heroes in **app.component.ts** before binding the Angular app to the server-side data. <script src="https://gist.github.com/ismcagdas/355f10c589313c600cfafa856aba5e59.js"></script> As you can see in the above code, two arrays are created and populated with dummy data in ngOnInit. Then, show this dummy data in **app.component.html:** <script src="https://gist.github.com/ismcagdas/deea7fa988ead2cee8a78728bc408566.js"></script> On the page, the left side contains hero companies and the right side contains hero cards. Add your hero images to the "assets/images/heroes" folder using the "**{hero.name}.png**" naming convention. We have already done it. Now, you can build the Angular app using the below command: ```bash
ng build --prod --app 1 --output-hashing=false
``` Then, hit F5 to run the ASP.NET Core app and see what the UI looks like:  ## Listing Server-Side Data As the last step, you will show server-side data in the Angular app. In order to do that, you need to inject **IHeroAppService** into the **HomeController**. To register **HomeController** for dependency injection, derive it from **HeroShopControllerBase**; then you can easily inject **IHeroAppService**.
```csharp
public class HomeController : HeroShopControllerBase
{
    private readonly IHostingEnvironment _hostingEnvironment;
    private readonly IHeroAppService _heroAppService;

    public HomeController(
        IHostingEnvironment hostingEnvironment,
        IHeroAppService heroAppService)
    {
        _hostingEnvironment = hostingEnvironment;
        _heroAppService = heroAppService;
    }

    //Other code...
}
``` To configure the ABP dependency injection system, you need to make some changes in Startup.cs. Change the return type of the **ConfigureServices** method to **IServiceProvider** and add the below code as the last line of the **ConfigureServices** method: ```csharp
return services.AddAbp<HeroShopPublicModule>();
``` Then, add the below code as the first line of the Configure method: ```csharp
app.UseAbp(options => { options.UseAbpRequestLocalization = false; });
``` Here is the final look of **Startup.cs**: <script src="https://gist.github.com/ismcagdas/f74643df531f9fff3aa8a8b121b5e0a3.js"></script> For the database connection, you need to add the below **appsettings.json** file to the **Acme.HeroShop.Web.Public** project. Even though you don't need authentication, it is necessary to add the **JwtBearer** configuration for the ABP Framework. <script src="https://gist.github.com/ismcagdas/e3c3425ced437b6be20ff394c813b8c2.js"></script> From now on, HomeController is ready to return data from the server side to the Angular app.
You can set the "**thisCameFromDotNET**" property of the **transferData** object in **HomeController** as shown below:

```csharp
TransferData transferData = new TransferData
{
    request = AbstractHttpContextRequestInfo(Request),
    thisCameFromDotNET = new
    {
        heroCompanies = _heroAppService.GetHeroCompanies(),
        heroes = _heroAppService.GetHeroes(null)
    }
};
```

You can access the data in **ngOnInit** of **app.component.ts** like below:

```ts
ngOnInit(): void {
  this.heroCompanies = (window as any).TRANSFER_CACHE.fromDotnet.heroCompanies;
  this.heroes = (window as any).TRANSFER_CACHE.fromDotnet.heroes;
}
```

Here is the final **app.component.ts**:

<script src="https://gist.github.com/ismcagdas/e3416747eea75e37775ed760af8118a5.js"></script>

And here is the final **app.component.html**:

<script src="https://gist.github.com/ismcagdas/84ebfc4e5c9576742a592ae8f1ac4bb6.js"></script>

Finally, create a **Shared** folder under the **Views** folder of the **Acme.HeroShop.Web.Public** project and create a **_Layout.cshtml** file with the content below:

<script src="https://gist.github.com/ismcagdas/6144be027b157ca6310e43cc68744517.js"></script>

Set this as the layout of your **Index.cshtml** by creating a **_ViewStart.cshtml** file with the following content under the Views folder of the **Acme.HeroShop.Web.Public** project:

```html
@{
    Layout = "_Layout";
}
```

To make this app work as a SPA, add the bundle scripts below to **Index.cshtml**. (This is needed because Angular Universal doesn't take responsibility for SPA features.)

<script src="https://gist.github.com/ismcagdas/78ced5541dcca3d32ee8cdbd284d4e3c.js"></script>

Finally, here is what it looks like.

## Summary

You can access the sample project on GitHub: [https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/AngularUniversal](https://github.com/aspnetboilerplate/aspnetboilerplate-samples/tree/master/AngularUniversal).
To run this project:

- First, build the project using Visual Studio
- Run the "Update-Database" command in Visual Studio's Package Manager Console
- Press F5 to start

Angular is a great framework, but it takes quite an amount of time to make it work with ASP.NET Core & Angular Universal. I hope this article helps you handle it smoothly.
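One practical note on the transfer-cache step earlier: the `ngOnInit` shown above reads `window.TRANSFER_CACHE` directly and will throw if the server-rendered script block is missing. Below is a defensive sketch; the helper name and the empty-array fallbacks are my own additions, not part of the sample project.

```typescript
// Hypothetical helper: read the transferred data without assuming the
// TRANSFER_CACHE global was rendered by the server.
interface TransferredData { heroes: any[]; heroCompanies: any[]; }

function readTransferCache(globalObj: any): TransferredData {
  const cache = globalObj?.TRANSFER_CACHE?.fromDotnet;
  return {
    // Fall back to empty lists so the page still renders without server data.
    heroes: cache?.heroes ?? [],
    heroCompanies: cache?.heroCompanies ?? [],
  };
}
```

In `ngOnInit` this would be used as `const { heroes, heroCompanies } = readTransferCache(window as any);`.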
ASP.NET Boilerplate v3.4 has just been released (see the [change logs](https://github.com/aspnetboilerplate/aspnetboilerplate/releases/tag/v3.4.0)). In this article, I will highlight some new features and changes.

<center> We were at the NDC London 2018 Conference as a partner </center>

# Entity History

This was one of the most wanted features for a [long time](https://github.com/aspnetboilerplate/aspnetboilerplate/issues/447) and it's finally available. Once you configure it for your entities, it can automatically write audit logs for every property change of an entity. There are three concepts for a database change:

- **Entity Change Set**: A set of changes in a single database save operation. It may contain one or more entity changes.
- **Entity Change**: A change (create, update or delete) for an entity. It belongs to an entity change set.
- **Entity Property Change**: A single property change of an entity. It belongs to an entity change.

So, for a relational database, all these changes are stored in three tables:

<center> Entity history tables in a relational database </center>

## How to Enable?

It's enabled by default, but this alone doesn't log anything: you should also enable it for your entities in the [PreInitialize](https://aspnetboilerplate.com/Pages/Documents/Module-System) method of your module. Example:

```csharp
Configuration.EntityHistory.Selectors.Add(
    "MyProjectEntities",
    typeof(Product), typeof(Order), typeof(Customer));
```

This code enables entity history logging for the Product, Order and Customer entities. "MyProjectEntities" is just a unique name for this configuration (so another module can later remove the configuration by this name).

## How to Change the Store?

ASP.NET Boilerplate is an extensible framework.
If you want to write entity history logs to another destination (for example, to a non-relational database or to a 3rd-party log provider), you can simply [replace](https://aspnetboilerplate.com/Pages/Documents/Startup-Configuration#replacing-built-in-services) IEntityHistoryStore with your own implementation.

# OData AspNetCore Integration

With the v3.4 release, we introduced the [Abp.AspNetCore.OData](https://www.nuget.org/packages/Abp.AspNetCore.OData) package. It's an integration package that makes it easier to use OData for ASP.NET Core within ASP.NET Boilerplate based projects. See the documentation for [OData AspNetCore Integration](https://aspnetboilerplate.com/Pages/Documents/OData-AspNetCore-Integration).

# Allow Modules to Configure MVC Routes

One problem with the modular approach is that every module may want to contribute to the main application configuration. Route definition is one example. A classic route definition for an ASP.NET Core MVC application looks like this:

```csharp
app.UseMvc(routes =>
{
    routes.MapRoute(
        name: "default",
        template: "{controller=Home}/{action=Index}/{id?}"
    );
});
```

This code lives in the Startup file, and an independent module cannot access it to add its own route definitions. With the ABP v3.4 release, we [introduced](https://github.com/aspnetboilerplate/aspnetboilerplate/issues/2940) a configuration point for routes. Now, a module can add a route definition (or modify existing routes) with code like the following in the PreInitialize method of the [module class](https://aspnetboilerplate.com/Pages/Documents/Module-System):

```csharp
public class MyModule : AbpModule
{
    public override void PreInitialize()
    {
        Configuration.Modules
            .AbpAspNetCore()
            .RouteConfiguration.Add(routes =>
            {
                routes.MapRoute(
                    name: "blog",
                    template: "Blog/{*article}",
                    defaults: new { controller = "Blog", action = "ReadArticle" }
                );
            });
    }

    //...
}
```

Any module can contribute to the route configuration like this.
Finally, we should modify the app.UseMvc statement to add all routes from the configuration:

```csharp
app.UseMvc(routes =>
{
    app.ApplicationServices
        .GetRequiredService<IAbpAspNetCoreConfiguration>()
        .RouteConfiguration
        .ConfigureAll(routes);
});
```

# Default Security Headers

As [OWASP](https://www.owasp.org/index.php/OWASP_Secure_Headers_Project#tab=Main) recommends, ASP.NET Boilerplate v3.4 [adds](https://github.com/aspnetboilerplate/aspnetboilerplate/pull/2820) some default headers to HTTP responses for security purposes:

- X-Content-Type-Options = nosniff
- X-XSS-Protection = 1; mode=block
- X-Frame-Options = SAMEORIGIN

If you want to disable this, you can do it in **app.UseAbp** in your Startup file as shown below:

```csharp
app.UseAbp(options =>
{
    options.UseSecurityHeaders = false;
});
```

# Others

See the GitHub change logs for all features and enhancements in this release: https://github.com/aspnetboilerplate/aspnetboilerplate/releases/tag/v3.4.0

# What's Next

Recently, we introduced ASP.NET Boilerplate to the Microsoft ASP.NET development team at [NDC London 2018](https://medium.com/volosoft/ndc-london-2018-impressions-been-there-424fceeb9df7). They liked it a lot and will help the project gain a wider community.

Our first goal is to provide **more introduction tutorials/videos** and make it easier to get started with the ASP.NET Boilerplate framework. For enhancements and new features, follow the GitHub [issues](https://github.com/aspnetboilerplate/aspnetboilerplate/issues) and [milestones](https://github.com/aspnetboilerplate/aspnetboilerplate/milestones).
It was an exciting experience for us to be a partner in the [NDC London](https://ndc-london.com/) 2018 Conference. We had an amazing week in London, where about 700 software professionals from all around the world met. NDC London is a set of talks for software developers and architects. It's not only a conference; it also aims to connect people working on the same subjects.

## We Met The Great Guys!

<center> We introduced the ABP framework to Scott Hanselman and Steve Sanderson</center>

We had a **booth** in the conference hall, and we got the chance to meet the **ASP.NET team** and present the ABP framework. It was a pleasure to meet Scott **Hanselman**, Damian **Edwards**, David **Fowler** and Steve **Sanderson**. They liked the framework very much, and Scott Hanselman will try to help us make the framework more visible to the .NET community.

<center> From left to right; Alper Ebicoglu, David Fowler, Halil Ibrahim Kalkan, Damian Edwards and Phil Haack</center>

We also had the chance to talk to people from all around the world, inform them about our projects, and have great discussions at our booth. This was the first time we got on stage with our frameworks.

## London

January is a cold time in London! London's attractions are quieter than usual this month, so it's a great time to enjoy a stroll around a quiet museum and avoid queues at popular attractions. Being engineers, we visited the Science Museum.

<center> London Eye & Science Museum</center>

## About the Organization

<center>Scott Guthrie showing off face recognition</center>

NDC London 2018 was wonderful; from the food & drinks to the talks, from the atmosphere to the participants, it was well organized and performed. We will probably participate next year too.

<center>Volosoft was one of the partners of the organization among industry leading companies</center>
In this article, I will explain how to use DevExtreme components in ASP.NET Zero based applications. I won't explain the details of how to configure and run ASP.NET Zero; there is already a detailed explanation [**here**](https://aspnetzero.com/Documents). I will focus on how to integrate DevExtreme into the project and how to use it. Then I will create an advanced (paging, sorting, filtering) DevExtreme datagrid example.

# Introduction to DevExtreme

DevExtreme is integrated with the following libraries and frameworks:

- **jQuery** versions 2.1–2.2 and 3.x
- **Knockout** versions 2.2.3–2.3.0 and 3.1.0–3.4.0
- **AngularJS** versions 1.2–1.5
- **Angular** versions 2.2.1+

In this article I will integrate DevExtreme with ASP.NET Zero using the jQuery and Angular versions. As you know, there are Angular and jQuery versions of ASP.NET Zero, too. So I will use the [**DevExtreme jQuery version**](https://js.devexpress.com/Documentation/Guide/Getting_Started/Widget_Basics_-_jQuery/Create_and_Configure_a_Widget/) in [**ASP.NET Zero Core MVC & jQuery**](https://aspnetzero.com/Documents/Getting-Started-Core) and the [**DevExtreme Angular version**](https://github.com/DevExpress/devextreme-angular#readme) in [**ASP.NET Zero Core Angular**](https://aspnetzero.com/Documents/Getting-Started-Angular).

NOTE: I will also show a DevExtreme ASP.NET Core & MVC integration example, but we don't recommend this approach since it's not easy to use with the existing design.

# DevExtreme JQuery

## Integration

First, [download **ASP.NET Zero Core MVC & JQuery**](https://aspnetzero.com/Documents/Getting-Started-Core). To use DevExtreme in the project, just add its scripts and styles. There are different ways (local, CDN, Bower, npm) to install/reference the resources; I will install the required resources using npm. Add the [**devextreme npm package**](https://www.npmjs.com/package/devextreme) to **package.json**:

<center> Add devextreme npm package </center>

Visual Studio automatically downloads the related packages.
If it is not downloaded, you can update the packages by running the [`yarn`](https://yarnpkg.com/) command in the command line. Then add the required mappings to the **module.exports.mappings** section in **bundle.config.js**:

<center> Add mappings to bundle.config.js </center>

Add styles to the **wwwroot/view-resources/Areas/App/Views/_Bundles/app-layout-libs.css** bundle in **bundle.config.js**:

<center> Add styles </center>

Add scripts to the **wwwroot/view-resources/Areas/App/Views/_Bundles/app-layout-libs.js** bundle in **bundle.config.js**:

<center> Add scripts to bundle.config.js </center>

Finally, run the `gulp` command in the **.Mvc** project folder to copy the DevExtreme resources from **node_modules** to **wwwroot/lib**. Now, DevExtreme is ready to use in the project.

<center> Run gulp command under .Mvc project </center>

## Let's do an example

I will convert the Audit Logs page to use the DevExtreme datagrid, taking [this example](https://js.devexpress.com/Demos/WidgetsGallery/Demo/DataGrid/CustomDataSource/jQuery/Light/) as the base code. First, I removed the existing table tag and added a simple div in the **.cshtml** file as shown below:

<script src="https://gist.github.com/alirizaadiyahsi/b5c3be2cb66d64d4f54355d59143ca32.js"></script>

Then I changed the **AuditLogs/index.js** script, commenting the added and removed lines:

<script src="https://gist.github.com/alirizaadiyahsi/29e6ebb56efec97687a0b91e141b79d9.js"></script>

Actually, I just added code similar to the [example that I mentioned above](https://js.devexpress.com/Demos/WidgetsGallery/Demo/DataGrid/CustomDataSource/jQuery/Light/). Now we are ready to run the project and see how it looks.

<center> DevExtreme datagrid </center>

## What about other controls?

I will also change the DateRangePicker to the [**DevExtreme DatePicker**](https://js.devexpress.com/Demos/WidgetsGallery/Demo/DateBox/Overview/jQuery/Light/). Actually, there isn't a date range picker in the DevExtreme library, so I will build one using two date pickers.
**.Web.Mvc\Areas\App\Views\Auditlogs\Index.cshtml:**

<center> Add date html </center>

**.Web.Mvc\wwwroot\view-resources\Areas\App\Views\AuditLogs\Index.js**

<center> Add date js </center>

## And result

<center> DevExtreme date picker </center>

------

# DevExtreme Angular

## Integration

First, [download ASP.NET Zero Core & Angular](https://aspnetzero.com/Documents/Getting-Started-Angular). Like the jQuery version, you should add the DevExtreme npm packages (you can use other methods such as Bower, CDN or local files). I will use npm as before. Add the [**devextreme**](https://www.npmjs.com/package/devextreme) and [**devextreme-angular**](https://www.npmjs.com/package/devextreme-angular) npm packages to **package.json**:

<center> Add devextreme npm packages </center>

Then run the [`yarn`](https://yarnpkg.com/) command in the **angular** folder.

Add styles to **~\DevExtremeDemoSPA\angular\angular.json**:

<center> Add styles </center>

Finally, DevExtreme is ready to use in the project.

## Let's do an example

In this example, I will convert the AuditLogs grid to a DevExtreme datagrid. I will use [this example](https://js.devexpress.com/Demos/WidgetsGallery/Demo/DataGrid/CustomDataSource/Angular/Light/) code in the project, with some added filtering configuration. First I will change the **audit-logs.component.html** file:

<script src="https://gist.github.com/alirizaadiyahsi/3cc857afd6f929f40f2e0fd9be94cead.js"></script>

Now I will change **audit-logs.component.ts**:

<script src="https://gist.github.com/alirizaadiyahsi/e55080a0cdd6b7aa305ae5bb6f317a3b.js"></script>

And lastly, I will add **DxDataGridModule** to **admin.module.ts**:

<center> Add DxDataGridModule to admin.module.ts </center>

## And the result

Now we are ready to run the project and see how it looks.

<center> DevExtreme ASP.NET Zero integration </center>

## More example

Let's create a date example for the Angular version. It is as easy as the jQuery version.
**\angular\src\app\admin\audit-logs\audit-logs.component.html**

<center> Add date control </center>

**\angular\src\app\admin\admin.module.ts**

<center> Import module </center>

## Result

<center> DevExtreme date box </center>

------

# DevExtreme ASP.NET Core & MVC

## Integration

**1.** [**Download DevExtreme**](https://go.devexpress.com/DevExpressDownload_DevExtremeCompleteTrial.aspx)

**2.** [**Download ASPNET Zero**](https://aspnetzero.com/Documents/Getting-Started-Core)

**3. Create a local NuGet package source for DevExpress**

To create a local NuGet source, right-click the **.Mvc** project, select **"Manage Nuget Packages"** and click the settings button.

<center> Manage Nuget Packages </center>

The following window will open. Click the **"+"** button to add a new source. Enter any name you want into the **"Name"** input, select the folder that contains the DevExtreme NuGet packages **(C:\Program Files (x86)\DevExpress 17.2\DevExtreme\System\DevExtreme\Bin\AspNetCore)** for the **"Source"** input, and click the **"Update"** button.

<center> Creating local nuget package source </center>

Then you will see the DevExtreme NuGet packages. Select and install these two packages: **DevExtreme.AspNet.Data** and **DevExtreme.AspNet.Core**.

<center> Install DevExtreme nuget packages </center>

**4. Install bower packages**

> Note: We don't use Bower in our projects anymore, but DevExtreme uses Bower for its ASP.NET Core controls. npm packages and Bower packages are different, so if you use npm to install the packages, the DevExtreme controls will not run, because some required files are missing from the npm packages. (This applies only to DevExtreme ASP.NET Core & MVC.)

Open a command prompt at the **.Mvc** project location and run `bower init` to create the **bower.json** file and fill in the fields (you can type what you want). For example:

<center>Run bower init command </center>

To install the Bower packages, run the `bower install --save` command.
We should install the **devextreme** and **devextreme-aspnet-data** packages.

<center>Run bower install — save command </center>

**5. Reference and configure scripts and styles**

First, add mappings to **bundle.config.js** to copy files from **bower_components** to **wwwroot/lib**. Add the following lines to the **module.exports.mappings** section:

```js
"bower_components/devextreme/css/dx.common.css": "devextreme/css",
"bower_components/devextreme/css/dx.light.css": "devextreme/css",
"bower_components/devextreme/js/dx.all.js": "devextreme/js",
"bower_components/devextreme/js/dx.aspnet.mvc.js": "devextreme/js",
"bower_components/devextreme-aspnet-data/js/dx.aspnet.data.js": "devextreme-aspnet-data/js",
"bower_components/devextreme/css/icons": "devextreme/css/icons"
```

The file will look like this:

<center>Adding mappings to bundle.config.js </center>

Add styles to **bundle.config.js**:

<center>Styles </center>

Add scripts to **_Bundles/app-layout-libs.js** in **bundle.config.js**:

<center>Scripts </center>

**6. Import the `DevExtreme.AspNet.Mvc` namespace**

Import the `DevExtreme.AspNet.Mvc` namespace in the **_ViewImports.cshtml** file located in the **.Mvc/Areas/AppAreaName/Views** folder:

```html
@using DevExtreme.AspNet.Mvc
```

Now you can use the DevExtreme controls in ASP.NET Zero. Let's do an example.

## Example of Usage

Remove the existing grid in **.Web.Mvc\Areas\App\Views\AuditLogs\Index.cshtml** and add the DevExtreme grid:

<script src="https://gist.github.com/alirizaadiyahsi/ff177f8b68d67e685d1da0dd0d4310dc.js"></script>

Also change the corresponding JavaScript file:

<script src="https://gist.github.com/alirizaadiyahsi/8f25f195f8848bbaec671ab9d5929d07.js"></script>

## Result

<center>DevExtreme datagrid</center>

## More example

I will add a date box ([this example](https://js.devexpress.com/Demos/WidgetsGallery/Demo/DateBox/Overview/NetCore/Light/)) like in the examples above.
**.Web.Mvc\Areas\App\Views\Auditlogs\Index.cshtml**

<center>Add date box control html</center>

**.Web.Mvc\wwwroot\View-resources\Areas\App\Views\Auditlogs\index.js**

<center>Add scripts to handle data changes</center>

## Result

<center>DevExtreme date box</center>

I tried to add the DevExtreme controls without changing the existing design. We suggest using the jQuery version of DevExtreme with the ASP.NET Zero MVC & jQuery version rather than the DevExtreme ASP.NET Core components (HTML helpers), since it fits our design better.

# Conclusion

I used examples from the DevExtreme Demos page with only minor changes. When you use DevExtreme datagrids in ASP.NET Zero, you can't properly use the grid's built-in filter feature, because DevExtreme grids are designed for a full list of data while ASP.NET Zero is designed to always return paged data. If you enable the filtering feature, the grid will only search the data in the current page. You can of course return the full table from the server, but that doesn't perform well if the table has too many rows.
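The trade-off above can be made concrete with a small sketch: instead of letting the grid filter the one page it already has, forward both the filter text and the grid's `skip`/`take` load options to the server, which filters the whole table before paging. The `fetchPage` callback stands in for the real audit-log service call and is hypothetical, and the plain filter string is a simplification of DevExtreme's filter expression format.

```typescript
interface PagedResult<T> { totalCount: number; items: T[]; }
interface LoadOptions { filter?: string; skip?: number; take?: number; }

// Build a grid load function that delegates filtering AND paging to the server.
function makeServerLoad<T>(
  fetchPage: (filter: string, skip: number, take: number) => PagedResult<T>,
) {
  return (options: LoadOptions) => {
    // The server applies the filter over the whole table, then pages the
    // result, so the client never has to scan more than one page.
    const page = fetchPage(options.filter ?? "", options.skip ?? 0, options.take ?? 10);
    // Remote-operation grids expect { data, totalCount } back.
    return { data: page.items, totalCount: page.totalCount };
  };
}
```

With this shape, the grid shows the correct total record count even though each request only transfers one page.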
*I've been developing the [Asp.Net Zero](https://aspnetzero.com/) and [Asp.Net Boilerplate](https://aspnetboilerplate.com/) web frameworks. Recently we added a Xamarin project integrated with the Asp.Net Zero backend. I am sharing my experiences in this article.*

Xamarin is a brilliant idea that overcomes cross-platform development difficulties. It solves the dilemmas many developers face when developing cross-platform apps: separate coding languages and UI paradigms. And it gives smooth user experiences with native output on iOS, Android, and UWP. With Xamarin.Forms, interface design for all three platforms can be accomplished within its XAML-based framework. It's very good to release an app with maximum code sharing across the three platforms.

## > It works…

First things first, Xamarin really works. A sample application can easily be developed and published to the market when you follow the getting started docs.

## > Large variety of components library

Xamarin is on the rise today. Its component library grows rapidly. Components range from user interface controls to code libraries, as well as user interface themes.

## > Backed by Microsoft!

After Microsoft acquired Xamarin, it has evolved a lot. Microsoft provides strong support for further Xamarin development. If you are a .NET developer, Xamarin looks very promising and tempting.

## > Where there's smoke, there's a fire!

When you dive into Xamarin development, it is not as easy as it looks from the outside. Xamarin was established in 2011; that's a long time, but it's still not fully mature because it is under constant development. You always need to follow the latest version to eliminate bugs and issues. Sometimes you might even need to use the Visual Studio Preview version.

## > Reaching Localhost Web APIs

When you first debug your Web API hosted on your local computer, you may not be able to connect to it from the emulator.
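As the next paragraph details, each debug target needs a different host address to reach the API on your machine. That mapping can be captured in a small helper; the function name and the example LAN IP are hypothetical, not part of any Xamarin API.

```typescript
type DebugTarget = "androidEmulator" | "genymotion" | "realDevice";

// Pick the Web API host per debug target.
function apiHost(target: DebugTarget, lanIp: string): string {
  switch (target) {
    case "androidEmulator": return "10.0.2.2"; // Android emulator loopback to the host machine
    case "genymotion":      return "10.0.3.2"; // Genymotion loopback to the host machine
    case "realDevice":      return lanIp;      // device must be on the same network as the host
  }
}
```

For example, `apiHost("realDevice", "192.168.1.20")` would point a physical phone at your computer's LAN address.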
Even though you can reach your Web API from localhost, in the Android emulator you need to use the loopback address (10.0.2.2); for the Genymotion emulator it's 10.0.3.2. When you debug the app from a real device, you need to set it to your computer's local network IP, and your mobile device needs to be on the same network.

## > External Network Access to Kestrel in ASP.NET Core

When you want to debug Web API code at the same time as running the Xamarin app, it becomes really painful. With the default Kestrel configuration, requests from outside localhost are not accepted. There are some machine-level configurations to make Kestrel reachable from outside your computer, but that's tedious. So I run the Web API from the CLI. I made a batch file for that:

```bash
CD /D "D:\Github\MyProject.Web.Host"
SET ASPNETCORE_URLS=http://*:22742
dotnet run
```

## > Debugger cannot attach on initial running

When you first run the Xamarin app for debugging, it starts the selected emulator, installs the application and attaches the debugger. But quite often Visual Studio hangs while attaching the debugger. So what I usually do is start the emulator manually; when it settles down, I start the Xamarin project.

## > Switching between keyboard and mobile device…

If you need to test a new feature on a real device, you often need to switch between the keyboard and the mobile device. Especially if the feature is reachable only after deep menu taps, this slows down your development. For such cases, you can control your device from your computer. For Android I found the app [Vysor](https://www.vysor.io/); there are surely alternatives.

## > Sunday driver: Xaml Previewing

I've spent most of my time arranging Xaml pages. Visual Studio has a Forms Previewer tool that's supposed to show the view on the fly, but it never works; everybody is complaining about it. So another tool was released: [Xamarin Live Player](https://www.xamarin.com/live).
When I first saw it, I was very excited, but when I tried to run my app with it, I got various exceptions. The first was Access To Path Denied, which is being discussed in the [Xamarin Forum](https://forums.xamarin.com/discussion/102821/live-player-android-access-to-path-denied). And in recent releases I started to get *Failed to load assembly from stream: Mono.Cecil.AssemblyResolutionException*, which I already filed on [Bugzilla](https://bugzilla.xamarin.com/show_bug.cgi?id=60820).

## > Best UI development tool for Xamarin.Forms: LiveXAML

There are some paid solutions for Xaml previewing. I tried Gorilla Player XAML Live Preview, but it wasn't working with .NET Standard. Finally I found LiveXAML, which works better than the others. It's a great tool and its response is extremely quick. It's very useful when you want to see a change on both the iOS and Android platforms at the same time. See https://www.livexaml.com/

## > ListViews…

ListView is a very useful control for listing enumerable items, but if you don't know how to use it, it will be annoying. From my experience, it works best when it's the root element of the page. When you put a ListView in a ScrollView, you might have scrolling problems, and nested ListViews are another problematic case. Before you start to use a ListView, take a look at this [Xamarin document](https://developer.xamarin.com/guides/xamarin-forms/user-interface/listview/performance/#Improving_ListView_Performance).

## > Do Commit Often!

Committing often is a general best practice, but when it comes to Xamarin development, it becomes even more important. After some code changes, Xamarin can stop building and give nonsensical errors, and you may not understand which code broke things. Committing often is a safeguard: in case you make a small change in your code and something goes wrong, you don't lose the version you were working on. It reduces the reliance on human memory and relies on (redundant) computer storage instead.
## > Dealing with Unhandled Exceptions

Catching unhandled exceptions in Xamarin is quite painful. A mobile app should never exit unexpectedly; that's why you have to catch unhandled exceptions and unobserved task exceptions app-wide. Things are not always what they seem! When an unhandled exception occurs, Android will typically destroy the process, so there's not a lot that can be done on the Android side.

<center> Unhandled native exception window </center>

## > Keep an eye on the "Device Logs"

If you see a native exception like the one above, you can open the Device Log window to see what's going on. Even if the exception isn't clearly visible there, it gives you some evidence of which control spoils the game. I have experienced many times that the unhandled exception handlers don't seem to catch crashes. It could be because the operating system kills the app before the app can do anything.

<center> Opening Device Log Window</center>

<center> Filtering Device Logs: Mono & MonoDroid</center>

## > HockeyApp

To catch unhandled exceptions and collect live crash reports, there's an error reporting service called [HockeyApp](https://hockeyapp.net/). It's worth integrating with it; you get two free apps with HockeyApp.

## > Lack of Xaml Designer

There is not yet a visual designer for generating Xaml in Xamarin.Forms applications. All Xaml must be hand-written, and for me this is the biggest disadvantage. Microsoft should definitely provide a Xaml designer.

## > Conclusion

For .NET developers, Xamarin is the best way to go. Sure, the output will never be native in every sense of the word. But when it comes to creating an app that operates on different platforms from a single codebase, Xamarin has come closest to imitating a native mobile application and is definitely a technology worth taking advantage of.
We are proud to release ASP.NET Zero **v5.0** with significant improvements and exciting new features. *For those who don't know what ASP.NET Zero is*:

> *ASP.NET Zero is a well-architected Visual Studio solution that comes with full source code. You can take it as the base solution for your next web & mobile application and develop your business code on top of it.*

# Mobile (Xamarin) Application

The most important feature is a new mobile application developed using the Xamarin platform. It supports Android and iOS. It currently has very basic functionality:

- Login, change profile picture (using camera or photo gallery)
- User management (list, search, create, edit, delete, unlock)
- Tenant management (list, search, create, edit, delete)
- Dependency injection, localization, navigation infrastructure
- C# proxy classes to use the server-side API

We created a [development guide](https://aspnetzero.com/Documents/Development-Guide-Xamarin) that explains the solution structure as the first document, and we are currently writing a [step by step development](https://aspnetzero.com/Documents/Developing-Step-By-Step) guide. We plan to add other important features of the web application over time, but we want to keep the mobile application simpler than the web application, as expected.

# Metronic 5 UI

We completely changed the user interface to [Metronic's new UI](http://keenthemes.com/metronic/), version 5.0. We not only changed the theme but also added a **visual settings** page that allows changing the menu, header, footer and layout of the application per tenant/user. An example layout with a dark header and top menu:

# Setup Screen

We created a setup page that can be used to initially configure the application, set the admin password and create the database schema for a new deployment. This page is shown only if the database is not available, and it makes a new deployment much easier.
# Road Map

We are constantly improving the code base to provide a strong architectural model and clean code for your solutions. The next big improvement will be a Rapid Application Development tool that creates basic CRUD pages for a given entity class. Over time, we plan to add more automation for common tasks; our goal is to assist you in your development process too. For other planned features, you can check our [road map](https://aspnetzero.com/Documents/Road-Map). The items on the road map are subject to change based on customer feedback and our internal decisions, but you can expect to see most of them in upcoming versions.
In this article, I will show how to create a **web farm** on **Docker** using **Redis** and **HAProxy**, step by step. In this sample, there will be a **Web API** project as the web server app (.NET Core) and a load balancer (HAProxy). I will replicate the Web API to test the load balancer, and the Web API instances will use the same cache server (Redis) to share data. Finally, I will run this web farm on **Docker**. I won't go into the details of how to install/configure Docker on Windows; there is documentation about this [here](https://docs.docker.com/docker-for-windows/install/#start-docker-for-windows), and there is a [Getting Started](https://docs.docker.com/docker-for-windows/) document as well.

## Creating the Web API Project

First, I create a basic Web API project from the template. Then I change **ValuesController** to set/get/remove cache keys:

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Distributed;

namespace WebFarmExample.Controllers
{
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        private readonly IDistributedCache _memoryCache;

        public ValuesController(IDistributedCache memoryCache)
        {
            _memoryCache = memoryCache;
        }

        [HttpGet("SetCacheData")]
        public IActionResult SetCacheData()
        {
            try
            {
                var time = DateTime.Now.ToLocalTime().ToString(CultureInfo.InvariantCulture);
                var cacheOptions = new DistributedCacheEntryOptions
                {
                    AbsoluteExpiration = DateTime.Now.AddYears(1)
                };
                _memoryCache.Set("serverTime", Encoding.UTF8.GetBytes(time), cacheOptions);
                return Json(new { status = true });
            }
            catch (Exception ex)
            {
                return Json(new { ex = ex });
            }
        }

        [HttpGet("GetCacheData")]
        public string GetCacheData()
        {
            try
            {
                var time = Encoding.UTF8.GetString(_memoryCache.Get("serverTime"));
                ViewBag.data = time;
                return time;
            }
            catch (Exception ex)
            {
                return ex.GetBaseException().Message;
            }
        }

        [HttpGet("RemoveCacheData")]
        public bool RemoveCacheData()
        {
            _memoryCache.Remove("serverTime");
            return true;
        }

        // GET api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }

        // GET api/values/5
        [HttpGet("{id}")]
        public string Get(int id)
        {
            return "value";
        }

        // POST api/values
        [HttpPost]
        public void Post([FromBody]string value)
        {
        }

        // PUT api/values/5
        [HttpPut("{id}")]
        public void Put(int id, [FromBody]string value)
        {
        }

        // DELETE api/values/5
        [HttpDelete("{id}")]
        public void Delete(int id)
        {
        }
    }
}
```

## Configure the Web API Project for Docker

To run the Web API project on Docker, I first add a **Dockerfile** to the project root folder and modify it as follows:

```dockerfile
FROM microsoft/aspnetcore:1.1.2
WORKDIR /app
COPY . .
ENTRYPOINT ["dotnet", "WebFarmExample.dll"]
```

In the lines above, the image tag aspnetcore:1.1.2 matches the Microsoft.AspNetCore.Mvc version in the **.csproj** file:

```xml
<PackageReference Include="Microsoft.AspNetCore.Mvc" Version="1.1.2"/>
```

Also, we should change the "**Copy To Output Directory**" property of the **Dockerfile** to "**Copy always**" so that the file is copied to the publish folder. If we don't set this property, we have to add the **Dockerfile** to the publish folder manually.

Now the Web API project is ready to publish and run on Docker.

## Prepare PowerShell Scripts to Build the Web API Project and Run Docker

First, I prepared a PowerShell script named **build-all.ps1** to publish the Web API project and build the **Dockerfile** to create a Docker image.
```bash
# COMMON PATHS
$dockerFolder = (Get-Item -Path "./" -Verbose).FullName
$dOutputFolder = Join-Path $dockerFolder "outputs"
$slnFolder = Join-Path $dockerFolder "../"
$webapiFolder = Join-Path $slnFolder "WebFarmExample"

## CLEAR ######################################################################
Remove-Item $dOutputFolder -Force -Recurse
New-Item -Path $dOutputFolder -ItemType Directory

## RESTORE NUGET PACKAGES #####################################################
Set-Location $slnFolder
dotnet restore

## PUBLISH WEB API PROJECT ###################################################
Set-Location $webapiFolder
dotnet publish --output (Join-Path $dOutputFolder "webapi")

## CREATE DOCKER IMAGES #######################################################
# Webapi
Set-Location (Join-Path $dOutputFolder "webapi")
docker rmi ali/webapi -f
docker build -t ali/webapi .

## FINALIZE ###################################################################
Set-Location $dockerFolder
```

The script above is a basic PowerShell script that creates a publish folder for the Web API project and moves it to **WebFarmExample > docker_config > outputs**. All our Docker files and PowerShell scripts are located in **WebFarmExample > docker_config**. I created a **docker-compose.yml** to manage the Docker images:

```bash
version: '2'

services:
  ali_webapi:
    image: ali/webapi
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
    ports:
      - "9901:80"
```

And **up.ps1** for the docker-compose commands:

```bash
docker rm $(docker ps -aq)
docker-compose up -d ali_webapi
```

## Publishing Web API Project on Docker

To publish the Web API project and run it on Docker, I open a PowerShell window in **WebFarmExample > docker_config** and run the build script. When I run it, it creates an **outputs** folder that contains the Web API publish folder. As you can see, the **Dockerfile** is published, too. Now we can run the **up.ps1** script, and the Web API project is running on Docker.
Let's check if it is really working! First, I check whether the Docker container is running: when I run `docker ps`, I can see the container in the list. Then I check it from the browser, using the port I set in **docker-compose.yml** earlier. That finishes the Web API step. In the next step, I will add a Redis cache to store the Web API's cache keys.

## Adding Redis Cache to Web Farm

I will add a Redis cache to the web farm project. To do this, I added the Redis configuration to **docker-compose.yml**. The latest version of the file looks like this:

```bash
version: '2'

services:
  ali_webapi:
    image: ali/webapi
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
    ports:
      - "9901:80"
  ali_redis:
    image: ali/redis
    ports:
      - "6379:6379"
```

And I change the **up.ps1** script file, adding a command to start the Redis container:

```bash
docker rm $(docker ps -aq)
docker-compose up -d ali_redis
sleep 3
docker-compose up -d ali_webapi
```

I start Redis before the Web API: if the Web API stores cache data at startup time, Redis must already be running so that we don't miss any cache data. Finally, I add some code to the Web API project to use Redis as the distributed cache. Before this change, we need to install the Redis cache extensions. **NuGet packages** we should install:

```bash
Microsoft.Extensions.Caching.Redis
StackExchange.Redis.StrongName
```

**Startup.cs**

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddMvc();
    services.AddDistributedRedisCache(options =>
    {
        options.Configuration = "localhost";
        options.InstanceName = "redisInstance";
    });
}
```

And now we can run the project on Docker with the PowerShell scripts as before: I run them in order (first **build-all.ps1**, then **up.ps1**) in **WebFarmExample > docker_config**. Now, let's check whether the Redis cache actually works.
To verify that Redis is really running, I call the controller actions that manage the cache keys: first the action that sets the cache key, then the action that gets it. And the result: we get errors, because Redis is using a different IP, assigned automatically by Docker. To see the Redis IP, you can run the **docker inspect container_id** command. There are two ways to fix this. The first is hardcoding: we can add this IP to the Redis configuration in the Web API startup class.

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddMvc();
    services.AddDistributedRedisCache(options =>
    {
        options.Configuration = "172.21.0.2";
        options.InstanceName = "redisInstance";
    });
}
```

After this change, I run the build and up scripts again. (I forgot: before running the build and up scripts, we should stop the Docker containers with **docker-compose down -v --rmi local**; you can also create a **down.ps1** script and put this command in it.) Now I test the Web API project again, and it works. There is a more elegant way to fix this error, though. Here is the final version of **Startup.cs**:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddMvc();
    services.AddDistributedRedisCache(options =>
    {
        options.Configuration = Dns.GetHostAddressesAsync("dockerconfig_ali_redis_1")
            .Result.FirstOrDefault().ToString();
        options.InstanceName = "redisInstance";
    });
}
```

**dockerconfig_ali_redis_1** is the name of the Redis container. With this change, the Web API project resolves the Redis host's address via DNS at startup. That finishes the Redis step. In the next step, I will add **HAProxy** and finish the article.

## Adding Haproxy to Web Farm

I will add HAProxy to the project for load balancing. First, I add a file named **haproxy.cfg** to configure HAProxy.
```bash
global
    maxconn 4096

defaults
    mode http
    timeout connect 5s
    timeout client 50s
    timeout server 50s

listen http-in
    bind *:8080
    server web-1 dockerconfig_ali_webapi_1:80
    server web-2 dockerconfig_ali_webapi_2:80
    server web-3 dockerconfig_ali_webapi_3:80
    stats enable
    stats uri /haproxy
    stats refresh 1s
```

And I modify **docker-compose.yml** and **up.ps1** as follows.

**docker-compose.yml**

```bash
version: '2'

services:
  ali_redis:
    image: ali/redis
    ports:
      - "6379:6379"
  ali_webapi:
    image: ali/webapi
    environment:
      - ASPNETCORE_ENVIRONMENT=Staging
  load_balancer:
    image: haproxy:1.7.1
    volumes:
      - "./haproxy.cfg:/usr/local/etc/haproxy/haproxy.cfg"
    ports:
      - "9911:8080"
```

**up.ps1**

```bash
docker rm $(docker ps -aq)
docker-compose up -d ali_redis
sleep 3
docker-compose up -d ali_webapi
sleep 2
docker-compose scale ali_webapi=3
sleep 2
docker-compose up -d load_balancer
```

## How can we know if it works?

I modify the **GetCacheData** action in **ValuesController.cs**:

```csharp
[HttpGet("GetCacheData")]
public string GetCacheData()
{
    try
    {
        var time = Encoding.UTF8.GetString(_memoryCache.Get("serverTime"));
        ViewBag.data = time;
        return "Server time: " + time + " - Machine name: " + Environment.MachineName;
    }
    catch (Exception ex)
    {
        return ex.GetBaseException().Message;
    }
}
```

I added the machine name to the returned data to see which Web API instance handled the call. After setting the cache data, I try to get the same cache value from different machines. As you can see, when I refresh the **GetCacheData** page, I get the same value from different machines. It is also possible to watch **HAProxy** at work via the **HAProxy web interface**: when you refresh the **GetCacheData** page, you can see HAProxy routing the requests to different machines. You can track which machine is serving requests under the **Session rate > Cur** column.
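For a quick check from the command line (a sketch assuming the compose setup above, with the load balancer published on host port 9911 and a running cluster), you can call the endpoint repeatedly; HAProxy should rotate across the three containers while the cached time stays the same:

```shell
# Hit the load-balanced endpoint a few times. Each response should show
# the same "Server time" (shared Redis cache) but a different
# "Machine name" (a different container behind HAProxy).
for i in 1 2 3; do
  curl -s http://localhost:9911/api/values/GetCacheData
  echo
done
```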
You can download the source code of the example project here: [https://github.com/alirizaadiyahsi/WebFarmExample](https://github.com/alirizaadiyahsi/WebFarmExample)
## What is an application framework?

An application framework makes writing applications easier. In detail, the framework takes on the complexities of interfacing with the environment and simplifies them for you; it handles the non-business-related details so that you can focus on your business and save time. An application framework lets you build applications from scratch while writing less code, and it prevents you from repeating yourself.

## What are the advantages of using an application framework?

**DRY** — Don't Repeat Yourself! — is one of the main principles of being a good developer. When developing new web applications, we face the same requirements: most web applications need **login pages**, **authorization management**, **localization**, **exception handling**, **logging** and so on. A high-quality, large-scale application should also implement **best practices of software design and development**. Starting a new enterprise web application is difficult, and since all applications need these common tasks, we keep repeating ourselves. Many companies develop their own **application frameworks or libraries** for such common tasks so they don't re-develop the same things; others **copy** parts of existing applications to prepare **a starting point** for their new application. The first approach is good if your company is big enough and has time to develop such a framework; in the long view, writing/using a framework is the best decision.

## What is ASP.NET Boilerplate?

[ASP.NET Boilerplate](https://github.com/aspnetboilerplate/aspnetboilerplate) was developed for the reasons mentioned above. It is a starting point for new, modern web applications using **best practices** and the **most popular tools**. It aims to be a **solid model**, a general-purpose **application framework** and a **project template**.
It is one of the most starred application frameworks in the .NET world on GitHub.

**ASP.NET Boilerplate features:**

- Based on the latest **ASP.NET Core**, **ASP.NET MVC** & **Web API**.
- Implements **Domain Driven Design** (Entities, Repositories, Domain Services, Application Services, DTOs, Unit of Work and so on).
- Implements **Layered Architecture** (Domain, Application, Presentation and Infrastructure layers).
- Provides an infrastructure to develop reusable and composable **modules** for large projects.
- Uses the most popular **frameworks/libraries** that (probably) you're already using.
- Provides an infrastructure that makes it easy to use **Dependency Injection** (uses Castle Windsor as the DI container).
- Provides a strict model and base classes to use **Object-Relational Mapping** easily (directly supports EntityFramework and EntityFramework.Core).
- Supports and implements **database migrations**.
- Includes a simple and flexible **localization** system.
- Includes an **EventBus** for server-side global domain events.
- Manages **exception handling** and **validation**.
- Creates a **dynamic Web API layer** for application services.
- Provides **base and helper classes** to implement common tasks.
- Uses the **convention over configuration** principle.
- Provides **project templates** for **Single-Page Applications** (with the latest version of **AngularJs**) and **Multi-Page Applications**. Templates are based on **Twitter Bootstrap**.
- The most commonly used JavaScript libraries are included and configured by default.
- Creates **dynamic JavaScript proxies** to call application services (via the dynamic Web API layer) easily.
- Includes **unique APIs** for some common tasks: showing alerts & notifications, blocking the UI, making AJAX requests…

Besides this common infrastructure, a module named [module-zero](https://github.com/aspnetboilerplate/module-zero) has been developed.
It provides a role- and permission-based **authorization** system (implementing the latest ASP.NET Identity Framework), a **settings** system, **multi-tenancy** and so on.

## ASP.NET Boilerplate Templates

There are some free templates developed with [**ABP Framework**](https://github.com/aspnetboilerplate/aspnetboilerplate). There is also an additional framework for authentication management named [**module-zero**](https://github.com/aspnetboilerplate/module-zero), developed separately from ABP Framework.

- [**aspnetboilerplate/aspnetboilerplate-templates**](https://github.com/aspnetboilerplate/aspnetboilerplate-templates): Angular 1.x, MVC 5.x basic layout (full .NET Framework)
- [**aspnetboilerplate/module-zero-template**](https://github.com/aspnetboilerplate/module-zero-template): Angular 1.x, MVC 5.x + Abp.Zero basic layout with authorization (full .NET Framework)
- [**aspnetboilerplate/aspnet-core-template**](https://github.com/aspnetboilerplate/aspnet-core-template): Angular 4, ASP.NET Core basic layout (.NET Core)
- [**aspnetboilerplate/module-zero-core-template**](https://github.com/aspnetboilerplate/module-zero-core-template): Angular 4, ASP.NET Core + Abp.Zero basic layout (.NET Core)

As you can see, there are templates with different options. You can download the templates above from **ABP Framework**'s site: <https://aspnetboilerplate.com/Templates>. ABP Framework has a great community and is updated constantly.
In this article, I won't explain what dependency injection (DI) is. I will try to explain how DI in ASP.NET Core works, what we can do with it, and how we can use other DI containers (Autofac and Castle Windsor) with ASP.NET Core. ASP.NET Core provides a minimal feature set with its default services container. You may want to use a different DI container, because the built-in DI in ASP.NET Core is quite primitive; for example, it doesn't support property injection or advanced service registration methods. Let's go through the examples and try to understand the basics of DI in ASP.NET Core.

```csharp
class Program
{
    static void Main(string[] args)
    {
        IServiceCollection services = new ServiceCollection();
        services.AddTransient<MyService>();

        var serviceProvider = services.BuildServiceProvider();

        var myService = serviceProvider.GetService<MyService>();
        myService.DoIt();
    }
}

public class MyService
{
    public void DoIt()
    {
        Console.WriteLine("Hello MS DI!");
    }
}
```

`IServiceCollection` is the collection of service descriptors. We can register our services in this collection with different lifestyles (transient, scoped, singleton). `IServiceProvider` is the simple built-in container included in ASP.NET Core that supports constructor injection by default. We get the registered services using the service provider.

## Service Lifestyle/Lifetimes

We can register services with the following lifestyles.

## Transient

Transient services are created each time they are requested.

## Scoped

Scoped services are created once per scope (in a web application, once per request).

## Singleton

A singleton service is created the first time it is requested, and that single instance is used by every subsequent request. Let's try to understand this better with some examples. First, I create the service classes.
```csharp
public class TransientDateOperation
{
    public TransientDateOperation()
    {
        Console.WriteLine("Transient service is created!");
    }
}

public class ScopedDateOperation
{
    public ScopedDateOperation()
    {
        Console.WriteLine("Scoped service is created!");
    }
}

public class SingletonDateOperation
{
    public SingletonDateOperation()
    {
        Console.WriteLine("Singleton service is created!");
    }
}
```

Then I create a service provider, register the services, and resolve them with their different lifestyles.

```csharp
static void Main(string[] args)
{
    Demo2();
}

private static void Demo2()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<TransientDateOperation>();
    services.AddScoped<ScopedDateOperation>();
    services.AddSingleton<SingletonDateOperation>();

    var serviceProvider = services.BuildServiceProvider();

    Console.WriteLine();
    Console.WriteLine("-------- 1st Request --------");
    Console.WriteLine();

    var transientService = serviceProvider.GetService<TransientDateOperation>();
    var scopedService = serviceProvider.GetService<ScopedDateOperation>();
    var singletonService = serviceProvider.GetService<SingletonDateOperation>();

    Console.WriteLine();
    Console.WriteLine("-------- 2nd Request --------");
    Console.WriteLine();

    var transientService2 = serviceProvider.GetService<TransientDateOperation>();
    var scopedService2 = serviceProvider.GetService<ScopedDateOperation>();
    var singletonService2 = serviceProvider.GetService<SingletonDateOperation>();

    Console.WriteLine();
    Console.WriteLine("-----------------------------");
    Console.WriteLine();
}
```

I resolved the services twice to see which ones are recreated. As the result shows, only the transient service is created again on the second request; the scoped and singleton services aren't. However, this example doesn't make the scoped service's lifetime clear, because both resolutions happen in the same scope.
Let's do another example to understand scoped service instances better. I modified the demo as follows:

```csharp
private static void Demo3()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<TransientDateOperation>();
    services.AddScoped<ScopedDateOperation>();
    services.AddSingleton<SingletonDateOperation>();

    var serviceProvider = services.BuildServiceProvider();

    Console.WriteLine();
    Console.WriteLine("-------- 1st Request --------");
    Console.WriteLine();

    using (var scope = serviceProvider.CreateScope())
    {
        var transientService = scope.ServiceProvider.GetService<TransientDateOperation>();
        var scopedService = scope.ServiceProvider.GetService<ScopedDateOperation>();
        var singletonService = scope.ServiceProvider.GetService<SingletonDateOperation>();
    }

    Console.WriteLine();
    Console.WriteLine("-------- 2nd Request --------");
    Console.WriteLine();

    using (var scope = serviceProvider.CreateScope())
    {
        var transientService = scope.ServiceProvider.GetService<TransientDateOperation>();
        var scopedService = scope.ServiceProvider.GetService<ScopedDateOperation>();
        var singletonService = scope.ServiceProvider.GetService<SingletonDateOperation>();
    }

    Console.WriteLine();
    Console.WriteLine("-----------------------------");
    Console.WriteLine();
}
```

As the result shows, the scoped service instance is created once per scope, so a new instance appears in each scope. In this part, we looked at the ASP.NET Core DI service provider, the service collection and the registration lifestyles/lifetimes.

# IServiceCollection and IServiceProvider

In this part, I will do more examples with IServiceCollection and IServiceProvider to better understand how the DI mechanism of ASP.NET Core works. We will examine some ASP.NET Core DI features through examples: these features matter because the framework itself relies on them, and knowing them helps us design our projects better.
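As a compact recap of all three lifetimes, the same behavior can be checked with reference equality instead of constructor logging. This is a minimal sketch, reusing the three service classes defined above:

```csharp
// Compare resolved instances directly to see each lifetime's behavior.
var services = new ServiceCollection();
services.AddTransient<TransientDateOperation>();
services.AddScoped<ScopedDateOperation>();
services.AddSingleton<SingletonDateOperation>();

var provider = services.BuildServiceProvider();

using (var scope1 = provider.CreateScope())
using (var scope2 = provider.CreateScope())
{
    var sp1 = scope1.ServiceProvider;
    var sp2 = scope2.ServiceProvider;

    // Transient: a new instance on every resolution.
    Console.WriteLine(ReferenceEquals(
        sp1.GetService<TransientDateOperation>(),
        sp1.GetService<TransientDateOperation>())); // False

    // Scoped: one instance per scope...
    Console.WriteLine(ReferenceEquals(
        sp1.GetService<ScopedDateOperation>(),
        sp1.GetService<ScopedDateOperation>()));    // True

    // ...but a different instance in another scope.
    Console.WriteLine(ReferenceEquals(
        sp1.GetService<ScopedDateOperation>(),
        sp2.GetService<ScopedDateOperation>()));    // False

    // Singleton: one instance for the whole container.
    Console.WriteLine(ReferenceEquals(
        sp1.GetService<SingletonDateOperation>(),
        sp2.GetService<SingletonDateOperation>())); // True
}
```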
Let's start with the examples.

## Factory Method

First, I create the classes to test.

```csharp
public class MyService
{
    private readonly IMyServiceDependency _dependency;

    public MyService(IMyServiceDependency dependency)
    {
        _dependency = dependency;
    }

    public void DoIt()
    {
        _dependency.DoIt();
    }
}

public class MyServiceDependency : IMyServiceDependency
{
    public void DoIt()
    {
        Console.WriteLine("Hello from MyServiceDependency");
    }
}

public interface IMyServiceDependency
{
    void DoIt();
}
```

And I register the services for DI:

```csharp
static void Main(string[] args)
{
    FactoryMethodDemo();
}

public static void FactoryMethodDemo()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<IMyServiceDependency, MyServiceDependency>();

    // Overload method for factory registration
    services.AddTransient(
        provider => new MyService(provider.GetService<IMyServiceDependency>())
    );

    var serviceProvider = services.BuildServiceProvider();
    var instance = serviceProvider.GetService<MyService>();
    instance.DoIt();
}
```

In the second registration, I used the factory overload to register the service.

## Instance Registration

You can create an object instance before registering it. Here is the object for instance registration:
```csharp
public class MyInstance
{
    public int Value { get; set; }
}
```

And the demo:

```csharp
static void Main(string[] args)
{
    InstanceRegistrationDemo();
}

public static void InstanceRegistrationDemo()
{
    var instance = new MyInstance { Value = 44 };

    IServiceCollection services = new ServiceCollection();
    services.AddSingleton(instance);

    foreach (ServiceDescriptor service in services)
    {
        if (service.ServiceType == typeof(MyInstance))
        {
            var registeredInstance = (MyInstance)service.ImplementationInstance;
            Console.WriteLine("Registered instance : " + registeredInstance.Value);
        }
    }

    var serviceProvider = services.BuildServiceProvider();
    var myInstance = serviceProvider.GetService<MyInstance>();
    Console.WriteLine("Registered service by instance registration : " + myInstance.Value);
}
```

First, I created an object before registering its type in DI. Then I accessed the registered instance in two ways: through the descriptor's **ImplementationInstance** property and through **serviceProvider.GetService**.

## Generic Type Registration

I think this is one of the best features. Suppose we have a generic repository. Without this feature, we would have to register a closed repository type for every entity separately. But ASP.NET Core DI supports open generic registration, and it is easy to register and resolve:

```csharp
// IMyGeneric<T> / MyGeneric<T> are a sample open generic
// interface/implementation pair.
IServiceCollection services = new ServiceCollection();
services.AddTransient<MyClassWithValue>();
services.AddTransient(typeof(IMyGeneric<>), typeof(MyGeneric<>));

var serviceProvider = services.BuildServiceProvider();
var service = serviceProvider.GetService<IMyGeneric<MyClassWithValue>>();
```

## Multiple Registration

ASP.NET Core DI supports registering multiple implementations with different options, which gives us some flexibility. Let's do an example to understand it better. I define the objects to test this feature:
```csharp
public interface IHasValue
{
    object Value { get; set; }
}

public class MyClassWithValue : IHasValue
{
    public object Value { get; set; }

    public MyClassWithValue()
    {
        Value = 42;
    }
}

public class MyClassWithValue2 : IHasValue
{
    public object Value { get; set; }

    public MyClassWithValue2()
    {
        Value = 43;
    }
}
```

I will show different usages of the multiple registration feature:

```csharp
static void Main(string[] args)
{
    MultipleImplementation();
    MultipleImplementationWithTry();
    MultipleImplementationWithReplace();
}

private static void MultipleImplementation()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<IHasValue, MyClassWithValue>();
    services.AddTransient<IHasValue, MyClassWithValue2>();

    var serviceProvider = services.BuildServiceProvider();

    var myServices = serviceProvider.GetServices<IHasValue>().ToList();
    var myService = serviceProvider.GetService<IHasValue>();

    Console.WriteLine("----- Multiple Implementation Services -----------");
    foreach (var service in myServices)
    {
        Console.WriteLine(service.Value);
    }

    Console.WriteLine("----- Multiple Implementation Service ------------");
    Console.WriteLine(myService.Value);
}

private static void MultipleImplementationWithTry()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<IHasValue, MyClassWithValue>();
    services.TryAddTransient<IHasValue, MyClassWithValue2>();

    var serviceProvider = services.BuildServiceProvider();
    var myServices = serviceProvider.GetServices<IHasValue>().ToList();

    Console.WriteLine("----- Multiple Implementation Try ----------------");
    foreach (var service in myServices)
    {
        Console.WriteLine(service.Value);
    }
}

private static void MultipleImplementationWithReplace()
{
    IServiceCollection services = new ServiceCollection();
    services.AddTransient<IHasValue, MyClassWithValue>();
    services.Replace(ServiceDescriptor.Transient<IHasValue, MyClassWithValue2>());

    var serviceProvider = services.BuildServiceProvider();
    var myServices = serviceProvider.GetServices<IHasValue>().ToList();

    Console.WriteLine("----- Multiple Implementation Replace ------------");
    foreach (var service in myServices)
    {
        Console.WriteLine(service.Value);
    }

    Console.WriteLine("--------------------------------------------------");
}
```

In the first demo above, I registered **IHasValue** twice with different implementations; when I request a single service, the container returns the last registered one. In the second demo, the **TryAdd**- methods don't register anything if the service type is already registered. In the last one, I replace a registered service with another. The result makes this clearer.

# DI Options and Using Autofac and Castle Windsor

In this last part, I will explain **DI options** and how to use ASP.NET Core DI with **Autofac** and **Castle Windsor**.

## Options

There is a pattern that uses custom option classes to represent a group of related settings. As usual, I will explain it with an example. First, install [**Microsoft.Extensions.Options**](https://www.nuget.org/packages/Microsoft.Extensions.Options/) from NuGet.

```csharp
public class MyTaxCalculator
{
    private readonly MyTaxCalculatorOptions _options;

    public MyTaxCalculator(IOptions<MyTaxCalculatorOptions> options)
    {
        _options = options.Value;
    }

    public int Calculate(int amount)
    {
        return amount * _options.TaxRatio / 100;
    }
}

public class MyTaxCalculatorOptions
{
    public int TaxRatio { get; set; }

    public MyTaxCalculatorOptions()
    {
        TaxRatio = 118;
    }
}
```

I created a tax calculator class that reads how to calculate tax from an options class, injected as the **IOptions** generic type.
And here is the usage:

```csharp
static void Main(string[] args)
{
    ServiceOptionsDemo1();
}

private static void ServiceOptionsDemo1()
{
    IServiceCollection services = new ServiceCollection();
    services.AddOptions();
    services.AddTransient<MyTaxCalculator>();
    services.Configure<MyTaxCalculatorOptions>(options =>
    {
        options.TaxRatio = 135;
    });

    var serviceProvider = services.BuildServiceProvider();
    var calculator = serviceProvider.GetService<MyTaxCalculator>();

    Console.WriteLine(calculator.Calculate(100));
}
```

**IServiceCollection** has an extension method named **Configure**. Using this method, we can define/change option values in code. In this example we set the option values hardcoded, but of course we can also read them from a file such as JSON. Let's do an example for this.

## Configurations

The configuration API provides a way to read configuration at runtime from multiple sources. These sources can be files (.ini, .json, .xml, …), command-line arguments, environment variables, in-memory objects, etc. First, install [**Microsoft.Extensions.Options.ConfigurationExtensions**](https://www.nuget.org/packages/Microsoft.Extensions.Options.ConfigurationExtensions/) and [**Microsoft.Extensions.Configuration.Json**](https://www.nuget.org/packages/Microsoft.Extensions.Configuration.Json/). Then I add an **appsettings.json** to my project:

```json
{
  "TaxOptions": {
    "TaxRatio": "130"
  }
}
```

And I modify the demo as follows.
```csharp
static void Main(string[] args)
{
    ServiceOptionsDemo2();
}

private static void ServiceOptionsDemo2()
{
    var configuration = new ConfigurationBuilder()
        .AddJsonFile(Path.Combine(Directory.GetCurrentDirectory(), "appsettings.json"))
        .Build();

    IServiceCollection services = new ServiceCollection();
    services.AddOptions();
    services.AddScoped<MyTaxCalculator>();
    services.Configure<MyTaxCalculatorOptions>(configuration.GetSection("TaxOptions"));

    var serviceProvider = services.BuildServiceProvider();
    var calculator = serviceProvider.GetRequiredService<MyTaxCalculator>();

    Console.WriteLine(calculator.Calculate(200));
}
```

I added a **ConfigurationBuilder** to read the options from the JSON file. The structure in the JSON file matches the section selected with **configuration.GetSection("TaxOptions")**.

## Using Castle Windsor and Autofac

Using another DI container is very easy: we just add the NuGet package and create an instance of the container.

## Castle Windsor

First, I add [**Castle.Windsor.MsDependencyInjection**](https://www.nuget.org/packages/Castle.Windsor.MsDependencyInjection/) from NuGet. And the usage:

```csharp
IServiceCollection services = new ServiceCollection();
services.AddTransient<MyService>();

var windsorContainer = new WindsorContainer();
windsorContainer.Register(
    Component.For<MyService>()
);

var serviceProvider = WindsorRegistrationHelper.CreateServiceProvider(
    windsorContainer,
    services
);

var myService = serviceProvider.GetService<MyService>();
```

## Autofac

To use Autofac, add the [**Autofac.Extensions.DependencyInjection**](https://www.nuget.org/packages/Autofac.Extensions.DependencyInjection) NuGet package to your project.
And the usage:

```csharp
IServiceCollection services = new ServiceCollection();
services.AddTransient<MyService>();

var containerBuilder = new ContainerBuilder();
containerBuilder.RegisterType<MyService>();
containerBuilder.Populate(services);

var container = containerBuilder.Build();
var serviceProvider = new AutofacServiceProvider(container);

var myService = serviceProvider.GetService<MyService>();
```

As you can see, after adding the NuGet packages, you can use these containers through their own registration APIs and service providers. If you examine the source code of the example project, you will understand it better.

**Source code**: [ASP.NET Core Dependency Injection Training](https://github.com/alirizaadiyahsi/aspnet-core-dependency-injection-training)

* * *

## Read More:

1. [Real-Time Messaging In A Distributed Architecture Using ABP, SignalR & RabbitMQ](https://volosoft.com/blog/RealTime-Messaging-Distributed-Architecture-Abp-SingalR-RabbitMQ)
2. [ASP.NET Core 3.1 Webhook Implementation Using Pub/Sub Pattern](https://volosoft.com/blog/ASP.NET-CORE-3.1-Webhook-Implementation-Using-Pub-Sub)
3. [Why You Should Prefer Singleton Pattern over a Static Class?](https://volosoft.com/blog/Prefer-Singleton-Pattern-over-Static-Class)