Create an animated progress bar with HTML and JavaScript

Below is the HTML, CSS, and JavaScript code that creates an animated progress bar based on the percentage of items sold out of the total items.

Save the code below as an .html file and open it in a browser:

<!DOCTYPE html>
<html>
<head>
    <title>Progress Bar Update</title>
    <style>
        /* Styling for the progress bar */
        :root {
            --change: 280px;
        }
        #progressBar{
            background-color: gold;
            width: 100px;
            padding: 10px;
            border-radius: 5px;
            animation: progressBar 2s ease;
            animation-fill-mode:both; 
            text-align: center;
            box-sizing: content-box;
            border: 1px solid #ccc;
        } 


        @keyframes progressBar {
            0% { width: 0; }
            100% { width: var(--change); }
        }

        .progress-container {
            position: relative;
            background: #eee;
            border-radius: 6px;
            width: 24%;
            overflow: hidden;
        }

        .progress-container::before {
            content: "";
            position: absolute;
            top: 0;
            left: 0;
            height: 100%;
            width: 0;
            background: gold;
        }
        
        #itemProgress, #itemPerc {
            font-family: 'Gill Sans', 'Gill Sans MT', Calibri, 'Trebuchet MS', sans-serif;
            font-style: oblique;
            color: red;
        }
    </style>
</head>
<body>
    <div id="itemPerc"></div>
    <div class="progress-container">
        <div id="progressBar"></div>
    </div>
    <div id="itemProgress"></div>
    <script>
        const root = document.documentElement; // used to update the --change custom property
 
        
        // Function to update the progress bar
        function updateProgressBar(soldItems, totalItems) {
            const progressBar = document.getElementById('progressBar');
            const percentage = (soldItems / totalItems) * 100;
            //progressBar.value = percentage;
            root.style.setProperty('--change', percentage.toFixed(2) + "%");
            
            const item_progress = document.getElementById('itemProgress');
            item_progress.innerHTML = `SOLD ITEMS : ${soldItems} out of ${totalItems}`;

            const item_perc = document.getElementById('itemPerc');
            item_perc.innerHTML = `${percentage.toFixed(2)}%`;
        }

        // Function to fetch data from the API
        async function fetchDataAndUpdateProgressBar() {
            try {
                //const url = 'https://dashboard.karmm.com/api/stage/full?secret=wXhshunASXORV08hKkuE95Fe';
                //const response = await fetch(url);
                //const data = await response.json();
                const data1 = JSON.parse('{"total_item":"1500000", "sold_item":"1000000"}');
                // Assuming the JSON response contains sold_item and total_item fields
                const sold_item = data1.sold_item;
                const total_item = data1.total_item;
                updateProgressBar(sold_item, total_item);
            } catch (error) {
                console.error('Error fetching data:', error);
            }
        }

        // Call the function to fetch data and update the progress bar
        fetchDataAndUpdateProgressBar();
    </script>
</body>
</html>
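The core of the update logic is the percentage calculation. It can be sketched as a standalone function (the function name and return shape here are illustrative, not part of the page above):

```javascript
// Compute progress values from sold/total counts.
// Returns the percentage (two decimals) and a display label.
function computeProgress(soldItems, totalItems) {
  const percentage = (soldItems / totalItems) * 100;
  return {
    percent: percentage.toFixed(2),
    label: `SOLD ITEMS : ${soldItems} out of ${totalItems}`,
  };
}

// The sample payload carries numbers as strings, so coerce with Number():
const data = JSON.parse('{"total_item":"1500000", "sold_item":"1000000"}');
const { percent, label } = computeProgress(Number(data.sold_item), Number(data.total_item));
console.log(percent); // "66.67"
console.log(label);   // "SOLD ITEMS : 1000000 out of 1500000"
```

Keeping the calculation in a pure function like this makes it easy to unit test separately from the DOM updates.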

What is the Adapter pattern, with an example in C#?

The Adapter pattern is a structural design pattern that allows two incompatible interfaces to work together. It involves creating a wrapper or adapter class that can translate the methods and properties of one interface to another, allowing objects that use one interface to work with objects that use a different interface.

Here’s an example of how the Adapter pattern can be used in C#:

// Target interface
public interface ITarget
{
    void Request();
}

// Adaptee: an existing class with an incompatible interface
public class Adaptee
{
    public void SpecificRequest()
    {
        Console.WriteLine("Adaptee: SpecificRequest");
    }
}

// Adapter implementation
public class Adapter : ITarget
{
    private Adaptee _adaptee;

    public Adapter(Adaptee adaptee)
    {
        _adaptee = adaptee;
    }

    public void Request()
    {
        Console.WriteLine("Adapter: Request");
        _adaptee.SpecificRequest();
    }
}

// Client code
static void Main(string[] args)
{
    Adaptee adaptee = new Adaptee();
    ITarget target = new Adapter(adaptee);

    target.Request();
}

In this example, the ITarget interface defines the target interface that the client code expects to use. The Adaptee class represents an interface that is incompatible with ITarget. The Adapter class is the bridge between the two interfaces. It wraps an instance of Adaptee and exposes a method that matches the ITarget interface, while internally calling the appropriate method of Adaptee.

When the client code calls the Request method on the target object, it is actually calling the Request method of the Adapter object. The Adapter object, in turn, calls the SpecificRequest method of the wrapped Adaptee object, effectively translating the call from the ITarget interface to the Adaptee interface.

How to improve SEO for WordPress.com blog?

Here are some tips to improve SEO for your WordPress.com blog:

  1. Choose an SEO-friendly theme: WordPress.com has many themes that are optimized for search engines. Choose a theme that is responsive, fast, and has a clean code structure.
  2. Use an SEO plugin: On plans that support plugins, install and activate an SEO plugin such as Yoast SEO or All in One SEO Pack. These plugins help you optimize your blog posts for search engines by suggesting keywords, meta descriptions, and other SEO elements.
  3. Research and use relevant keywords: Conduct keyword research to find out what your target audience is searching for. Use these keywords in your blog posts, titles, headings, and meta descriptions.
  4. Create high-quality content: Create content that is informative, engaging, and valuable to your target audience. Use visuals, videos, and other multimedia to enhance the user experience.
  5. Optimize your images: Optimize your images by compressing them, adding alt text, and using descriptive file names. This will improve your blog’s load time and help search engines understand your content.
  6. Build internal and external links: Link to your own content within your blog posts, and also link to external sources that provide additional value to your readers. This will help search engines understand the relevance of your content.
  7. Promote your blog: Promote your blog on social media, forums, and other platforms. The more exposure your blog gets, the more likely it is to attract backlinks and improve its search engine ranking.

By following these tips, you can improve your WordPress.com blog’s SEO and attract more organic traffic to your website.

Clean Architecture in .Net

Clean Architecture is a software design pattern that emphasizes separation of concerns, maintainability, and testability in .NET applications. It was introduced by Robert C. Martin, also known as Uncle Bob, in his book “Clean Architecture: A Craftsman’s Guide to Software Structure and Design”.

Clean Architecture involves breaking down a .NET application into multiple layers, each with its own set of responsibilities and dependencies. These layers include:

  • Presentation layer: Exposes the Web API endpoints and handles HTTP requests and responses. Depends on the Application layer interfaces.
  • Application layer: Implements the use cases of the application, orchestrating the Domain layer and Infrastructure layer to perform the necessary actions. Depends on the Domain layer and Infrastructure layer interfaces.
  • Domain layer: Defines the business entities and logic of the application, independent of the infrastructure or presentation details. Implements the Domain layer interfaces.
  • Infrastructure layer: Implements the interfaces defined in the Application layer and Domain layer, providing the necessary services and resources to accomplish the tasks. Depends on external services and libraries.
  • Tests: Contains unit and integration tests for the application.

The key principle of Clean Architecture is the Dependency Inversion Principle (DIP), which states that high-level modules should not depend on low-level modules; both should depend on abstractions. This allows for flexibility in the design and promotes modularity and testability.

Implementing Clean Architecture in .NET involves using design patterns such as Dependency Injection (DI), Inversion of Control (IoC), and Separation of Concerns (SoC) to achieve loose coupling and high cohesion between the layers. This helps to make the application more modular and easier to maintain, test, and evolve over time.
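Clean Architecture examples are usually written in C#, but the Dependency Inversion Principle at its core is language-agnostic. A minimal sketch in JavaScript (all names hypothetical): the high-level use case depends only on a repository abstraction, and the composition root injects a concrete implementation.

```javascript
// High-level use case: depends on a repository "port" (an abstraction),
// never on a concrete database implementation.
class GetUserHandler {
  constructor(userRepository) {      // dependency injected via constructor
    this.userRepository = userRepository;
  }
  handle(id) {
    return this.userRepository.findById(id);
  }
}

// Low-level adapter: implements the port. It could be swapped for an
// EF Core or Dapper repository, or an in-memory fake for tests.
class InMemoryUserRepository {
  constructor(users) { this.users = users; }
  findById(id) { return this.users.find(u => u.id === id) ?? null; }
}

// Composition root: wires the concrete adapter to the use case.
const repo = new InMemoryUserRepository([{ id: 1, name: "Ada" }]);
const handler = new GetUserHandler(repo);
console.log(handler.handle(1).name); // "Ada"
```

In a .NET solution, the same wiring is typically done in the Presentation layer's service registration, with the port defined in the Application or Domain layer and the adapter in the Infrastructure layer.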

MyApp/
├── MyApp.Api/                      # Presentation layer (Web API)
│   ├── Controllers/               # HTTP controllers
│   ├── Filters/                   # Action filters
│   ├── Program.cs                 # Web API entry point
│   └── Startup.cs                 # Web API configuration
├── MyApp.Application/              # Application layer (Use cases)
│   ├── Commands/                  # Command handlers
│   ├── Queries/                   # Query handlers
│   ├── Interfaces/                # Application interfaces (ports)
│   └── MyApp.Application.csproj   # Application layer project file
├── MyApp.Domain/                   # Domain layer (Business entities and logic)
│   ├── Entities/                  # Domain entities
│   ├── Exceptions/                # Domain exceptions
│   ├── Interfaces/                # Domain interfaces (ports)
│   ├── Services/                  # Domain services
│   └── MyApp.Domain.csproj        # Domain layer project file
├── MyApp.Infrastructure/           # Infrastructure layer (Database, I/O, external services)
│   ├── Data/                      # Data access implementation (EF Core, Dapper, etc.)
│   ├── External/                  # External services implementation (REST APIs, gRPC, etc.)
│   ├── Migrations/                # Database migrations (EF Core)
│   ├── Interfaces/                # Infrastructure interfaces (ports)
│   ├── Logging/                   # Logging implementation
│   ├── MyApp.Infrastructure.csproj # Infrastructure layer project file
│   └── SeedData/                  # Seed data for database
├── MyApp.Tests/                    # Unit and integration tests
│   ├── MyApp.UnitTests/            # Unit tests
│   └── MyApp.IntegrationTests/     # Integration tests
└── MyApp.sln                       # Solution file

This is just one example of a possible clean architecture project structure. You can adapt it to your specific needs and preferences.

Unable to load the service index for source (Azure DevOps)

While creating a NuGet package through an Azure DevOps pipeline and adding it to Artifacts, you might get the following or a similar error:

error NU1301: Unable to load the service index for source https://xxxx.dev.azure.com/MyApps/_packaging/MyPackage%40Local/nuget/v3/index.json

This usually happens when you publish the package in a separate job or stage from the build step.

To resolve this, run a NuGet package restore before packing and pushing the package:

steps:
  - task: DotNetCoreCLI@2
    displayName: Nuget package restore
    inputs:
      command: restore
      projects: '$(workingDirectory)/**/$(projectName)*.csproj'
      feedsToUse: 'config'
      nugetConfigPath: $(workingDirectory)/nuget.config
  - task: DotNetCoreCLI@2
    displayName: Create Nuget Package
    inputs:
      command: pack
      versioningScheme: byBuildNumber
      arguments: '--configuration $(buildConfiguration)'
      packagesToPack: '$(workingDirectory)/**/$(projectName).csproj'
      packDestination: '$(Build.ArtifactStagingDirectory)'
  - task: NuGetAuthenticate@0
    displayName: 'NuGet Authenticate'
  - task: NuGetCommand@2
    displayName: 'NuGet push'
    inputs:
      command: push
      publishVstsFeed: 'MyFramework'
      allowPackageConflicts: true

The variables in the above template can be passed from a pipeline that is triggered by a change to a branch or run manually.

Costs involved to delete/purge a blob in the Archive access tier

A blob cannot be read directly from the Archive tier. To read an archived blob, you must first rehydrate it to the Hot or Cool tier. For example, to retrieve and read a single 1,000-GB blob that has been in the Archive tier for 90 days, the following charges would apply:

Data retrieval (per GB) from the Archive tier: $0.022/GB x 1,000 GB = $22
Rehydrate operation (SetBlobTier Archive to Hot): $5.50 per 10,000 operations = $0.0006
Early deletion charge: (180 – 90 days)/30 days x $0.002/GB-month x 1,000 GB = $6.00
Read blob operation from Hot: $0.0044 per 10,000 operations = negligible
Total = $22 + $0.0006 + $6.00 = approximately $28

  • In addition to the per-GB, per-month charge, any blob that is moved to the Archive tier is subject to an Archive early deletion period of 180 days. Additionally, for general-purpose v2 storage accounts, any blob that is moved to the Cool tier is subject to a Cool tier early deletion period of 30 days. This charge is prorated. For example, if a blob is moved to the Archive tier and then deleted or moved to the Hot tier after 45 days, the customer is charged an early deletion fee for 135 (180 minus 45) days of storage in the Archive tier.
  • When a blob is moved from one access tier to another, its last modification time doesn’t change. If you manually rehydrate an archived blob to hot tier, it would be moved back to archive tier by the lifecycle management engine. Disable the rule that affects this blob temporarily to prevent it from being archived again. Re-enable the rule when the blob can be safely moved back to archive tier. You may also copy the blob to another location if it needs to stay in hot or cool tier permanently.
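The worked example above can be reproduced with a small calculation. The prices below mirror the example figures and are illustrative only; real prices vary by region and over time, so always check the Azure pricing page.

```javascript
// Illustrative Archive-tier prices (USD); subject to change by Azure.
const RETRIEVAL_PER_GB = 0.022;      // data retrieval from Archive, per GB
const ARCHIVE_PER_GB_MONTH = 0.002;  // Archive capacity price, per GB-month
const REHYDRATE_PER_10K = 5.5;       // SetBlobTier (Archive -> Hot), per 10,000 ops

function archiveReadCost(sizeGb, daysInArchive, earlyDeletionDays = 180) {
  const retrieval = RETRIEVAL_PER_GB * sizeGb;
  const rehydrate = REHYDRATE_PER_10K / 10000; // one rehydrate operation
  // Early deletion is prorated over the unused part of the 180-day window.
  const remainingMonths = Math.max(0, earlyDeletionDays - daysInArchive) / 30;
  const earlyDeletion = remainingMonths * ARCHIVE_PER_GB_MONTH * sizeGb;
  return retrieval + rehydrate + earlyDeletion;
}

// 1,000-GB blob read after 90 days in Archive:
console.log(archiveReadCost(1000, 90).toFixed(2)); // "28.00"
```

Note that once the blob has been in Archive longer than 180 days, the early deletion term drops to zero and only the retrieval and operation charges remain.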

Azure Archive Operations and pricing

Note that regular storage capacity charges for the account continue to accrue after the 180-day early deletion period ends.

  • Storage capacity is billed in units of the average daily amount of data stored, in gigabytes (GB), over a monthly period. For example, if you consistently used 10 GB of storage for the first half of the month, and none for the second half of the month, you would be billed for your average usage of 5 GB of storage. However, using the Cool (GPv2 accounts only) or Archive tier for less than 30 and 180 days respectively will incur an additional charge.

For more information refer to this article: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers#pricing-and-billing
For more information on pricing, please see this link: https://azure.microsoft.com/en-us/pricing/details/storage/blobs/.

Prices may vary and are subject to change by Azure.

The request was aborted: Could not create SSL/TLS secure channel

Since most servers are moving to TLS 1.3 and removing TLS 1.0/1.1 support, certain server configurations may be required to make your .NET Framework application compatible with newer TLS versions such as TLS 1.2.

Upgrading the application to the latest .NET Framework version (such as 4.8) helps: per the documentation, it automatically negotiates the newest TLS version the operating system supports once the older versions are disabled.

In my case, I resolved the issue by updating the SSL Cipher Suite Order on the server. I had mistakenly removed some suites that Windows suggested were for TLS 1.0/1.1 only, when in fact they were also needed for some TLS 1.2 connections.

I resolved my issues by:

  1. Open Run Prompt and run gpedit.msc
  2. Navigate to “Administrative Templates > Network > SSL Configuration Settings”
  3. Open SSL Cipher Suite Order
  4. Select Enabled
  5. Paste the list of suites below into the text box (make sure there are no spaces)
  6. Click Apply
  7. Restart the server

SSL SUITES:

TLS_DHE_RSA_WITH_AES_256_GCM_SHA384,TLS_DHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384_P256,TLS_DHE_RSA_WITH_AES_256_CBC_SHA,TLS_DHE_RSA_WITH_AES_128_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256_P256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256_P256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384_P384,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256_P256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384_P384,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA_P256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA_P256,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA_P256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA_P256,TLS_DHE_DSS_WITH_AES_128_CBC_SHA,TLS_DHE_DSS_WITH_AES_256_CBC_SHA

Note: these suites work for me, but you may require different ones for other applications. A full list and more information on the suites is available here: https://docs.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel?redirectedfrom=MSDN

You can also use a tool like IISCrypto to update the Cipher Suite order.

Modify Block Blob with Pessimistic Concurrency approach Azure

In this example, we’ll work with a block blob and a sample class named Assignment. The pessimistic concurrency approach takes a lease on the blob via a BlobLeaseClient and allows an overwrite only while a valid lease ID is supplied with the request; otherwise the request fails with HttpStatusCode.PreconditionFailed. For more details, see the Azure Storage documentation on managing concurrency.

The code below is from a .NET 6 console app.

The Assignment Class has the following Properties:

public class Assignment
{
    public int Id { get; set; }
    public string Code { get; set; }
    public string Kind { get; set; }
    public double pehe { get; set; }
}

The console app’s Program.cs fetches the blob content each time and manually adds another Assignment. In the fourth step, it fetches content from a second blob, appends the deserialized objects to the list of Assignments built in the previous steps, and finally overwrites the first blob with the full list.

using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;
using Newtonsoft.Json;
using System.Net;
using System.Text;

await PessimisticConcurrencyBlob();

Console.WriteLine("done");
Console.ReadLine();

async Task PessimisticConcurrencyBlob()
{
    Console.WriteLine("Demonstrate pessimistic concurrency");
    string connectionString = "xxxx"; // move to configuration in real code
    string filename = "testAssignment.json";
    string containerName = "mycontainer";
    BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient containerClient = _blobServiceClient.GetBlobContainerClient(containerName);


    BlobClient blobClient = containerClient.GetBlobClient(filename);
    BlobLeaseClient blobLeaseClient = blobClient.GetBlobLeaseClient();


    string filename2 = "assignments.json";
    BlobClient blobClient2 = containerClient.GetBlobClient(filename2);

    try
    {
        // Create the container if it does not exist.
        await containerClient.CreateIfNotExistsAsync();
        var blobAssList = await RetrieveBlobContentAsync(blobClient);
        // Upload json to a blob.
        Assignment assignment1 = new Assignment()
        {
            Id = 8,
            Code = "ABC",
            Kind =  "Lead",
            pehe = 10.0
        };
        blobAssList.Add(assignment1);

        var blobContents1 = JsonConvert.SerializeObject(blobAssList);
        byte[] byteArray = Encoding.UTF8.GetBytes(blobContents1);
        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, overwrite: true);
        }

        // Acquire a lease on the blob.
        BlobLease blobLease = await blobLeaseClient.AcquireAsync(TimeSpan.FromSeconds(60));
        Console.WriteLine("Blob lease acquired. LeaseId = {0}", blobLease.LeaseId);

        // Set the request condition to include the lease ID.
        BlobUploadOptions blobUploadOptions = new BlobUploadOptions()
        {
            Conditions = new BlobRequestConditions()
            {
                LeaseId = blobLease.LeaseId
            }
        };

        // Write to the blob again, providing the lease ID on the request.
        // The lease ID was provided, so this call should succeed.
        // Upload json to a blob.
        blobAssList = await RetrieveBlobContentAsync(blobClient);
        Assignment assignment2 = new Assignment()
        {
            Id = 9,
            Code = "DEF",
            Kind = "Assignment",
            pehe = 20.0
        };
        blobAssList.Add(assignment2);
        var blobContents2 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.UTF8.GetBytes(blobContents2);

        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, blobUploadOptions);
        }

        // Acquire a new lease, reusing the same lease client (and therefore
        // the same lease ID), and update the blob again with the lease provided.

        // Acquire a lease on the blob.
        BlobLease blobLease2 = await blobLeaseClient.AcquireAsync(TimeSpan.FromSeconds(60));
        Console.WriteLine("Blob lease acquired. LeaseId = {0}", blobLease2.LeaseId);

        // Set the request condition to include the lease ID.
        BlobUploadOptions blobUploadOptions2 = new BlobUploadOptions()
        {
            Conditions = new BlobRequestConditions()
            {
                LeaseId = blobLease2.LeaseId
            }
        };

        blobAssList = await RetrieveBlobContentAsync(blobClient);
        Assignment assignment3 = new Assignment()
        {
            Id = 10,
            Code = "GHI",
            Kind = "Assignment",
            pehe = 30.0
        };
        blobAssList.Add(assignment3);
        var blobContents3 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.UTF8.GetBytes(blobContents3);

        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            // The lease ID matches the active lease, so this call should succeed.
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, blobUploadOptions2);
        }

        // Calling another blob and add to first blob.
        BlobLease blobLease3 = await blobLeaseClient.AcquireAsync(TimeSpan.FromSeconds(60));
        Console.WriteLine("Blob lease acquired. LeaseId = {0}", blobLease3.LeaseId);

        // Set the request condition to include the lease ID.
        BlobUploadOptions blobUploadOptions3 = new BlobUploadOptions()
        {
            Conditions = new BlobRequestConditions()
            {
                LeaseId = blobLease3.LeaseId
            }
        };

        var blobAssList2 = await RetrieveBlobContentAsync(blobClient2);
        blobAssList.AddRange(blobAssList2);
        var blobContents4 = JsonConvert.SerializeObject(blobAssList);
        byteArray = Encoding.UTF8.GetBytes(blobContents4);

        using (MemoryStream stream = new MemoryStream(byteArray))
        {
            // The lease ID matches the active lease, so this call should succeed.
            BlobContentInfo blobContentInfo = await blobClient.UploadAsync(stream, blobUploadOptions3);
        }

    }
    catch (RequestFailedException e)
    {
        if (e.Status == (int)HttpStatusCode.PreconditionFailed)
        {
            Console.WriteLine(
                @"Precondition failure as expected. The lease ID was not provided.");
        }
        else
        {
            Console.WriteLine(e.Message);
            throw;
        }
    }
    finally
    {
        await blobLeaseClient.ReleaseAsync();
    }
}

The code for fetching the Blob Content is as follows:

async Task<List<Assignment>> RetrieveBlobContentAsync(BlobClient blobClient)
{

    var response = await blobClient.DownloadAsync();
    string content;
    using (var streamReader = new StreamReader(response.Value.Content))
    {
        content = await streamReader.ReadToEndAsync();
    }

    var assignments = JsonConvert.DeserializeObject<List<Assignment>>(content) ?? new List<Assignment>();

    return assignments;
}

Append to a text file with AppendBlock (Azure Blob Storage)

The example below takes input from the user in a .NET 6 console app and appends each input to a text file in Blob Storage using an append blob. The connection string grants access to the Blob Storage account and should be kept in the appsettings.json file rather than in code.

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using System.Text;

Console.WriteLine("please enter text to add to the blob: ");
string text = Console.ReadLine();

await AppendContentBlobAsync(text);

Console.WriteLine("done");
Console.ReadLine();


async Task AppendContentBlobAsync(string content)
{
    string connectionString = "xxxx";
    string filename = "test.txt";
    string containerName = "mycontainer";
    BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient container = _blobServiceClient.GetBlobContainerClient(containerName);
    await container.CreateIfNotExistsAsync();
    AppendBlobClient appendBlobClient = container.GetAppendBlobClient(filename);
    if (!await appendBlobClient.ExistsAsync())
    {
        await appendBlobClient.CreateAsync();
    }
    using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(content)))
    {
        await appendBlobClient.AppendBlockAsync(ms);
    }
}