What is the Adapter pattern, with an example in C#?

The Adapter pattern is a structural design pattern that allows two incompatible interfaces to work together. It involves creating a wrapper or adapter class that can translate the methods and properties of one interface to another, allowing objects that use one interface to work with objects that use a different interface.

Here’s an example of how the Adapter pattern can be used in C#:

// Target interface
public interface ITarget
{
    void Request();
}

// Adaptee implementation
public class Adaptee
{
    public void SpecificRequest()
    {
        Console.WriteLine("Adaptee: SpecificRequest");
    }
}

// Adapter implementation
public class Adapter : ITarget
{
    private readonly Adaptee _adaptee;

    public Adapter(Adaptee adaptee)
    {
        _adaptee = adaptee;
    }

    public void Request()
    {
        // Translate the call into the one the Adaptee understands.
        _adaptee.SpecificRequest();
    }
}

// Client code
static void Main(string[] args)
{
    Adaptee adaptee = new Adaptee();
    ITarget target = new Adapter(adaptee);
    target.Request(); // prints "Adaptee: SpecificRequest"
}


In this example, the ITarget interface defines the target interface that the client code expects to use. The Adaptee class represents an interface that is incompatible with ITarget. The Adapter class is the bridge between the two interfaces. It wraps an instance of Adaptee and exposes a method that matches the ITarget interface, while internally calling the appropriate method of Adaptee.

When the client code calls the Request method on the target object, it is actually calling the Request method of the Adapter object. The Adapter object, in turn, calls the SpecificRequest method of the wrapped Adaptee object, effectively translating the call from the ITarget interface to the Adaptee interface.


How to improve SEO for WordPress.com blog?

Here are some tips to improve SEO for your WordPress.com blog:

  1. Choose an SEO-friendly theme: WordPress.com has many themes that are optimized for search engines. Choose a theme that is responsive, fast, and has a clean code structure.
  2. Use an SEO plugin: Install and activate an SEO plugin such as Yoast SEO or All in One SEO Pack. These plugins will help you optimize your blog posts for search engines by giving you suggestions for keywords, meta descriptions, and other SEO elements.
  3. Research and use relevant keywords: Conduct keyword research to find out what your target audience is searching for. Use these keywords in your blog posts, titles, headings, and meta descriptions.
  4. Create high-quality content: Create content that is informative, engaging, and valuable to your target audience. Use visuals, videos, and other multimedia to enhance the user experience.
  5. Optimize your images: Optimize your images by compressing them, adding alt text, and using descriptive file names. This will improve your blog’s load time and help search engines understand your content.
  6. Build internal and external links: Link to your own content within your blog posts, and also link to external sources that provide additional value to your readers. This will help search engines understand the relevance of your content.
  7. Promote your blog: Promote your blog on social media, forums, and other platforms. The more exposure your blog gets, the more likely it is to attract backlinks and improve its search engine ranking.

By following these tips, you can improve your WordPress.com blog’s SEO and attract more organic traffic to your website.

Clean Architecture in .Net

Clean Architecture is a software design pattern that emphasizes separation of concerns, maintainability, and testability in .NET applications. It was introduced by Robert C. Martin, also known as Uncle Bob, in his book “Clean Architecture: A Craftsman’s Guide to Software Structure and Design”.

Clean Architecture involves breaking down a .NET application into multiple layers, each with its own set of responsibilities and dependencies. These layers include:

  • Presentation layer: Exposes the Web API endpoints and handles HTTP requests and responses. Depends on the Application layer.
  • Application layer: Implements the use cases of the application, orchestrating the Domain layer to perform the necessary actions. Depends only on the Domain layer and defines the interfaces (ports) that the Infrastructure layer implements.
  • Domain layer: Defines the business entities and logic of the application, independent of infrastructure or presentation details. Depends on nothing else.
  • Infrastructure layer: Implements the interfaces defined in the Application layer and Domain layer, providing the necessary services and resources (databases, external APIs, logging). Depends on external services and libraries.
  • Tests: Contains unit and integration tests for the application.

The key principle of Clean Architecture is the Dependency Inversion Principle (DIP), which states that high-level modules should not depend on low-level modules; both should depend on abstractions. This allows for flexibility in the design and promotes modularity and testability.

Implementing Clean Architecture in .NET involves using design patterns such as Dependency Injection (DI), Inversion of Control (IoC), and Separation of Concerns (SoC) to achieve loose coupling and high cohesion between the layers. This helps to make the application more modular and easier to maintain, test, and evolve over time.

├── MyApp.Api/                      # Presentation layer (Web API)
│   ├── Controllers/               # HTTP controllers
│   ├── Filters/                   # Action filters
│   ├── Program.cs                 # Web API entry point
│   └── Startup.cs                 # Web API configuration
├── MyApp.Application/              # Application layer (Use cases)
│   ├── Commands/                  # Command handlers
│   ├── Queries/                   # Query handlers
│   ├── Interfaces/                # Application interfaces (ports)
│   └── MyApp.Application.csproj   # Application layer project file
├── MyApp.Domain/                   # Domain layer (Business entities and logic)
│   ├── Entities/                  # Domain entities
│   ├── Exceptions/                # Domain exceptions
│   ├── Interfaces/                # Domain interfaces (ports)
│   ├── Services/                  # Domain services
│   └── MyApp.Domain.csproj        # Domain layer project file
├── MyApp.Infrastructure/           # Infrastructure layer (Database, I/O, external services)
│   ├── Data/                      # Data access implementation (EF Core, Dapper, etc.)
│   ├── External/                  # External services implementation (REST APIs, gRPC, etc.)
│   ├── Migrations/                # Database migrations (EF Core)
│   ├── Interfaces/                # Infrastructure interfaces (ports)
│   ├── Logging/                   # Logging implementation
│   ├── MyApp.Infrastructure.csproj# Infrastructure layer project file
│   └── SeedData/                  # Seed data for database
├── MyApp.Tests/                    # Unit and integration tests
│   ├── MyApp.UnitTests/            # Unit tests
│   └── MyApp.IntegrationTests/     # Integration tests
└── MyApp.sln                       # Solution file

This is just one example of a possible clean architecture project structure. You can adapt it to your specific needs and preferences.
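The dependency direction between these projects can be illustrated in a few lines of C#. This is a minimal sketch using hypothetical names (Order, IOrderRepository, GetOrderHandler — none of them come from the template above), and it assumes a .NET 6+ console app with implicit usings:

```csharp
// Usage: the use case depends only on the port, not on the concrete adapter.
var handler = new GetOrderHandler(new InMemoryOrderRepository());
Console.WriteLine(handler.Handle(1)?.Total); // prints 42

// Domain layer: a business entity with no external dependencies.
public record Order(int Id, decimal Total);

// Application layer: the port (interface) the use case depends on.
public interface IOrderRepository
{
    Order? GetById(int id);
}

public class GetOrderHandler
{
    private readonly IOrderRepository _repository;
    public GetOrderHandler(IOrderRepository repository) => _repository = repository;
    public Order? Handle(int id) => _repository.GetById(id);
}

// Infrastructure layer: an adapter implementing the port (in-memory here for
// brevity; a real implementation would use EF Core, Dapper, etc.).
public class InMemoryOrderRepository : IOrderRepository
{
    private readonly Dictionary<int, Order> _orders = new() { [1] = new Order(1, 42m) };
    public Order? GetById(int id) => _orders.TryGetValue(id, out var order) ? order : null;
}
```

In a real solution, IOrderRepository would live in MyApp.Application (or MyApp.Domain), the data-access implementation in MyApp.Infrastructure, and the two would be wired together with the built-in DI container in MyApp.Api.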

Unable to load the service index for source azure devops

While creating a NuGet package through an Azure DevOps pipeline and pushing it to Artifacts, you might get the following or a similar error:

error NU1301: Unable to load the service index for source https://xxxx.dev.azure.com/MyApps/_packaging/MyPackage%40Local/nuget/v3/index.json

This usually happens when you publish the package to the feed in a separate job or stage from the build step.

To resolve this, run a NuGet package restore before packing and pushing the package:

  - task: DotNetCoreCLI@2
    displayName: NuGet package restore
    inputs:
      command: restore
      projects: '$(workingDirectory)/**/$(projectName)*.csproj'
      feedsToUse: 'config'
      nugetConfigPath: $(workingDirectory)/nuget.config
  - task: DotNetCoreCLI@2
    displayName: Create NuGet Package
    inputs:
      command: pack
      versioningScheme: byBuildNumber
      arguments: '--configuration $(buildConfiguration)'
      packagesToPack: '$(workingDirectory)/**/$(projectName).csproj'
      packDestination: '$(Build.ArtifactStagingDirectory)'
  - task: NuGetAuthenticate@0
    displayName: 'NuGet Authenticate'
  - task: NuGetCommand@2
    displayName: 'NuGet push'
    inputs:
      command: push
      publishVstsFeed: 'MyFramework'
      allowPackageConflicts: true

The variables in the above template can be passed from a Pipeline that is triggered by a change to a branch or run manually.
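The restore step above expects a nuget.config at the path given by nugetConfigPath that points at your Azure Artifacts feed. A hypothetical example (the organization placeholder and feed name are illustrative; replace them with your own values):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="MyFramework" value="https://pkgs.dev.azure.com/{organization}/_packaging/MyFramework/nuget/v3/index.json" />
  </packageSources>
</configuration>
```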

Costs involved to delete/purge the blob that has Archive access tier

A blob cannot be read directly from the Archive tier. To read a blob in the Archive tier, a user must first change the tier to Hot or Cool. For example, to retrieve and read a single 1,000-GB archived blob that has been in the Archive tier for 90 days, the following charges would apply:

Data retrieval (per GB) from the Archive tier: $0.022/GB x 1,000 GB = $22.00
Rehydrate operation (SetBlobTier, Archive to Hot): $5.50 per 10,000 operations ≈ $0.0006
Early deletion charge: (180 – 90 days)/30 days x $0.002/GB-month x 1,000 GB = $6.00
Read blob operation from Hot tier: $0.0044 per 10,000 operations ≈ $0.0000 (negligible)
Total ≈ $22.00 + $0.0006 + $6.00 ≈ $28.00
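The early-deletion proration works like this (a quick sketch in a .NET 6+ console app; the helper name is made up for illustration, and exact prices vary by region and over time):

```csharp
using System;

// Early deletion: the unused remainder of the minimum retention period,
// expressed in 30-day months, times the per-GB-month price.
static double EarlyDeletionCharge(int retentionDays, int daysStored,
                                  double pricePerGbMonth, double sizeGb)
    => Math.Max(0, retentionDays - daysStored) / 30.0 * pricePerGbMonth * sizeGb;

// 1,000 GB deleted from Archive after 90 of the 180 required days.
Console.WriteLine(EarlyDeletionCharge(180, 90, 0.002, 1000)); // prints 6
```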

  • In addition to the per-GB, per-month charge, any blob that is moved to the Archive tier is subject to an Archive early deletion period of 180 days. Additionally, for general-purpose v2 storage accounts, any blob that is moved to the Cool tier is subject to a Cool tier early deletion period of 30 days. This charge is prorated. For example, if a blob is moved to the Archive tier and then deleted or moved to the Hot tier after 45 days, the customer is charged an early deletion fee for 135 (180 minus 45) days of storage in the Archive tier.
  • When a blob is moved from one access tier to another, its last modification time doesn’t change. If you manually rehydrate an archived blob to the Hot tier, it may be moved back to the Archive tier by the lifecycle management engine. Temporarily disable the rule that affects this blob to prevent it from being archived again, and re-enable the rule once the blob can safely be moved back to the Archive tier. You may also copy the blob to another location if it needs to stay in the Hot or Cool tier permanently.

Azure Archive Operations and pricing

Capacity charges continue to accrue for as long as the data remains in the storage account, and removing data from the Archive tier before the 180-day minimum incurs an additional early deletion charge.

  • Storage capacity is billed in units of the average daily amount of data stored, in gigabytes (GB), over a monthly period. For example, if you consistently used 10 GB of storage for the first half of the month, and none for the second half of the month, you would be billed for your average usage of 5 GB of storage. However, using the Cool (GPv2 accounts only) or Archive tier for less than 30 and 180 days respectively will incur an additional charge.
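The averaging in that example can be sketched as follows (the numbers are the ones from the text above; assumes a .NET 6+ console app):

```csharp
using System;
using System.Linq;

// Average daily storage over a 30-day month:
// 10 GB for the first 15 days, 0 GB for the remaining 15 days.
var dailyUsageGb = Enumerable.Repeat(10.0, 15).Concat(Enumerable.Repeat(0.0, 15));
Console.WriteLine(dailyUsageGb.Average()); // prints 5
```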

For more information refer to this article: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers#pricing-and-billing
For more information on pricing, please see this link: https://azure.microsoft.com/en-us/pricing/details/storage/blobs/.

Prices may vary subject to changes by Azure Services.

The request was aborted: Could not create SSL/TLS secure channel

Since most servers are moving towards TLS 1.2/1.3 and removing TLS 1.0/1.1 support, it is important to note certain server configurations that may be required to keep your .NET Framework application compatible with newer TLS versions such as TLS 1.2.

Upgrading the application to the latest .NET Framework version (such as 4.8) also helps: according to the documentation, it automatically negotiates the newer TLS versions when the older ones are disabled on the operating system.
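On the application side, an older .NET Framework app that cannot yet be retargeted can opt in to TLS 1.2 explicitly. This is a common stopgap rather than a definitive fix, and it does not replace the server-side configuration described below:

```csharp
using System;
using System.Net;

// Stopgap for older .NET Framework apps: allow TLS 1.2 in addition to the
// protocols already enabled. On .NET Framework 4.7+, prefer leaving
// SecurityProtocol at SystemDefault so the OS negotiates the protocol.
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls12;
Console.WriteLine(ServicePointManager.SecurityProtocol.HasFlag(SecurityProtocolType.Tls12)); // prints True
```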

I managed to resolve the issue on my server by updating the SSL Cipher Suite Order. I had mistakenly removed some of the suites that Windows suggested were for TLS 1.0/1.1 only, when in fact they were also needed for some TLS 1.2 connections.

I resolved my issues by:

  1. Open Run Prompt and run gpedit.msc
  2. Navigate to “Administrative Templates > Network > SSL Configuration Settings”
  3. Open SSL Cipher Suite Order
  4. Select Enabled
  5. Paste the list of suites below into the text box (make sure there are no spaces)
  6. Click Apply
  7. Restart the server



Note: these suites worked for me, but you may require other ones for different applications. You should be able to find a full list and more info on the suites here: https://docs.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel?redirectedfrom=MSDN

You can also use a tool like IISCrypto to update the Cipher Suite order.

Modify Block Blob with Pessimistic Concurrency approach Azure

In this example, we’ll work with a Block Blob and an example class named Assignment. The pessimistic concurrency approach takes a lease on the blob via a BlobLeaseClient and allows an overwrite only when the active lease ID is supplied; otherwise the request fails with HttpStatusCode.PreconditionFailed (HTTP 412). For more details, check the following document.

The code below is a .Net 6 Console App.

The Assignment Class has the following Properties:

public class Assignment
{
    public int Id { get; set; }
    public string Code { get; set; }
    public string Kind { get; set; }
    public double pehe { get; set; }
}

The Console App’s Program.cs fetches the blob content each time and manually adds another Assignment. In the fourth step, it fetches content from another blob, appends the deserialized objects to the list of Assignments built in the previous steps, and finally overwrites the first blob with all Assignments.

using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;
using Newtonsoft.Json;
using System.Net;
using System.Text;

await PessimisticConcurrencyBlob();


async Task PessimisticConcurrencyBlob()
{
    Console.WriteLine("Demonstrate pessimistic concurrency");
    string connectionString = "xxxx"; //ConfigurationManager.ConnectionStrings["storage"].Con;
    string filename = "testAssignment.json";
    string containerName = "mycontainer";
    BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient containerClient = _blobServiceClient.GetBlobContainerClient(containerName);

    BlobClient blobClient = containerClient.GetBlobClient(filename);
    BlobLeaseClient blobLeaseClient = blobClient.GetBlobLeaseClient();

    string filename2 = "assignments.json";
    BlobClient blobClient2 = containerClient.GetBlobClient(filename2);

    try
    {
        // Create the container if it does not exist.
        await containerClient.CreateIfNotExistsAsync();

        // Step 1: fetch the current content, add an assignment, and upload the json to the blob.
        var blobAssList = await RetrieveBlobContentAsync(blobClient);
        Assignment assignment1 = new Assignment()
        {
            Id = 8,
            Code = "ABC",
            Kind = "Lead",
            pehe = 10.0
        };
        blobAssList.Add(assignment1);

        var blobContents1 = JsonConvert.SerializeObject(blobAssList);
        byte[] byteArray = Encoding.ASCII.GetBytes(blobContents1);
        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, overwrite: true);
        }

        // Acquire a lease on the blob.
        BlobLease blobLease = await blobLeaseClient.AcquireAsync(TimeSpan.FromSeconds(60));
        Console.WriteLine("Blob lease acquired. LeaseId = {0}", blobLease.LeaseId);

        // Set the request condition to include the lease ID.
        BlobUploadOptions blobUploadOptions = new BlobUploadOptions()
        {
            Conditions = new BlobRequestConditions()
            {
                LeaseId = blobLease.LeaseId
            }
        };

        // Step 2: write to the blob again, providing the lease ID on the request.
        // The lease ID was provided, so this call should succeed.
        blobAssList = await RetrieveBlobContentAsync(blobClient);
        Assignment assignment2 = new Assignment()
        {
            Id = 9,
            Code = "DEF",
            Kind = "Assignment",
            pehe = 20.0
        };
        blobAssList.Add(assignment2);

        var blobContents2 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.ASCII.GetBytes(blobContents2);
        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, blobUploadOptions);
        }

        // Step 3: this code simulates an update by another client.
        // The lease is still active and no lease ID is provided, so this call fails.
        blobAssList = await RetrieveBlobContentAsync(blobClient);
        Assignment assignment3 = new Assignment()
        {
            Id = 10,
            Code = "GHI",
            Kind = "Assignment",
            pehe = 30.0
        };
        blobAssList.Add(assignment3);

        var blobContents3 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.ASCII.GetBytes(blobContents3);
        try
        {
            using (MemoryStream stream = new MemoryStream(byteArray))
            {
                // This call should fail with error code 412 (Precondition Failed).
                BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, new BlobUploadOptions());
            }
        }
        catch (RequestFailedException e) when (e.Status == (int)HttpStatusCode.PreconditionFailed)
        {
            Console.WriteLine(@"Precondition failure as expected. The lease ID was not provided.");
        }

        // Step 4: fetch content from another blob, append it to the committed list,
        // and overwrite the first blob with all assignments (lease ID provided again).
        blobAssList = await RetrieveBlobContentAsync(blobClient);
        var blobAssList2 = await RetrieveBlobContentAsync(blobClient2);
        blobAssList.AddRange(blobAssList2);

        var blobContents4 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.ASCII.GetBytes(blobContents4);
        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, blobUploadOptions);
        }
    }
    catch (RequestFailedException e)
    {
        Console.WriteLine("Unexpected failure: Status = {0}, Message = {1}", e.Status, e.Message);
    }
    finally
    {
        // Release the lease so later runs can acquire it again.
        await blobLeaseClient.ReleaseAsync();
    }
}

The code for fetching the Blob Content is as follows:

async Task<List<Assignment>> RetrieveBlobContentAsync(BlobClient blobClient)
{
    // Return an empty list if the blob does not exist yet (e.g. on the first run).
    if (!await blobClient.ExistsAsync())
        return new List<Assignment>();

    var response = await blobClient.DownloadAsync();
    string content;
    using (var streamReader = new StreamReader(response.Value.Content))
    {
        content = await streamReader.ReadToEndAsync();
    }

    var assignments = JsonConvert.DeserializeObject<List<Assignment>>(content);
    return assignments ?? new List<Assignment>();
}

Append to text file AppendBlock BlobStorage Azure

The example below takes input from the user in a .NET 6 console app and appends each input to a text file in Blob Storage using an append blob. The connectionString grants access to the storage account (for example, via a SAS) and should be managed in the appsettings.json file rather than hard-coded.

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using System.Text;

Console.WriteLine("please enter text to add to the blob: ");
string text = Console.ReadLine() ?? string.Empty;

await AppendContentBlobAsync(text);


async Task AppendContentBlobAsync(string content)
{
    string connectionString = "xxxx";
    string filename = "test.txt";
    string containerName = "mycontainer";
    BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient container = _blobServiceClient.GetBlobContainerClient(containerName);
    await container.CreateIfNotExistsAsync();

    AppendBlobClient appendBlobClient = container.GetAppendBlobClient(filename);
    if (!await appendBlobClient.ExistsAsync())
    {
        await appendBlobClient.CreateAsync();
    }

    using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(content)))
    {
        await appendBlobClient.AppendBlockAsync(ms);
    }
}

Entrepreneurs Can Maximize Their Business With These Data Analytics Strategies

Data analytics can tell business owners a lot about their customers, their marketing strategies, and their own products and inventory, but only if they’re used correctly. Having the right analytics tools can go a long way toward helping you map out goals for the coming months or even years, so it’s essential to look for the best possible resources and come up with a plan for integrating them with your daily practices. In the Tech Pit is a great place to start when you’re looking for the latest tech tools; here are a few tips on how to make data analytics a regular part of your operations.

Photo via Pexels

Boost your efficiency

When it comes to business practices, it’s essential to make sure that everything is running as smoothly as possible. Not only does this prevent mistakes and other issues, but ensuring efficiency will save your business money. Business Process Management can help you automate workflows within sales, accounting, and manufacturing by analyzing how all the moving parts work together in order to streamline them. As with any BPM framework, it’s important to monitor its effectiveness consistently and make changes as needed. Read more about how automation tools work with BPM to get an idea of the processes.

Manage your money

Not only can analytics save your business money by creating efficiency, it can help you manage that money more effectively by giving you insights into your cash flow and other financial data. The right accounting software can even give you detailed information on your inventory, provide invoicing tools, and help you manage profit and loss and balance sheet reports. Learn more about cloud accounting software and how they can help with tasks like figuring out which projects are profitable and which ones may need to be reconsidered.

Plan your marketing budget

Another aspect of your finances that needs to be carefully considered is the marketing budget, which means it’s crucial to figure out where to focus your spending. This can be challenging due to the sheer number of marketing options available, but data analytics can show you whether you’re on the right track. Utilizing different tools for different data can be useful depending on the information you’re looking to access, so think about what your goals are. Do you want to find out how many conversions your website has daily? Which content is getting the most attention? Look for resources that will parse out the most important data for your needs so you don’t have to spend a lot of time going through that information.

Track your customers’ experiences

As with any small business, customers are a priority, so once you know what’s working where your marketing and finances are concerned, it’s a good idea to spend some time learning more about your customers’ experiences. Tracking their movement across your business (such as where they’re spending the most time, whether they had a negative experience with a customer service representative, or whether they’re moving on to one of your competitors) can help you figure out what your business needs to work on and what you can do to win a customer back should they decide to leave. Analytics can also help you choose the best ways to engage with those customers in order to create a personalized experience.

Utilizing data analytics can be instrumental in helping you maximize your business practices, so think about what your goals are and do some research to find out more about the different tools you can use. With the right resources, you can ensure that your customers, finances, and daily operations are in good shape.

Need to get in touch with the team at In the Tech Pit? Reach out today.