.NET Engineer Pro Tools
Windows Pro Tips
-----------------
powertoys - https://apps.microsoft.com/store/detail/microsoft-powertoys/XP89DCGQ3K6VLD
devtoys - https://apps.microsoft.com/store/detail/devtoys/9PGCV4V3BK4W
Visual Studio 2022 Pro Tips
---------------------------
vscoloroutput - https://marketplace.visualstudio.com/items?itemName=MikeWard-AnnArbor.VSColorOutput
solutionColor - https://marketplace.visualstudio.com/items?itemName=Wumpf.SolutionColor
save vs settings to apply to other computer - https://learn.microsoft.com/en-us/visualstudio/install/import-export-installation-configurations?view=vs-2022
Podcasts
--------
Dev interrupted
Hacking Humans
Cyber Security Headlines
Click Here
Malicious Life
The Stack Overflow Podcast
The Backend Engineering Show (with Hussein Nasser)
The Changelog: Software Development, Open Source
Tech Stuff
Cyberwire Daily
Techmeme Ride Home
Soft Skills Engineering
Syntax - Tasty Web Development Treats
Cyber Security Today
Software Engineering Daily
Developer Tea
Coding Blocks .NET
The Cloud Cast
JS Party: JavaScript, CSS, Web Development
Go Time: Golang, Software Engineering
Cyber
Dev Questions with Tim Corey
Thoughtworks Technology Podcast
.NET Rocks!
Smashing Security
Hanselminutes with Scott Hanselman
Software Engineering
Talk Python To Me
Security Now
Darknet Diaries
Hacked
The .NET Core Podcast
The .NET MAUI Podcast
Kubernetes Podcast from Google
Adventures in .NET
Coding After Work
Base.cs Podcast
The Static Void Podcast
Tools
------
couchbase
honeycomb.io/changelog
firehydrant
logrocket
playwright
openmct
thundra.io
raygun
fly.io
appwrite
sentry.io
https://sourcegraph.com/
https://www.kolide.com/
https://entity.services/
WeekPlan
Docker Extensions
------------------
Ddosify - High-performance load testing tool
- https://github.com/ddosify/ddosify
BurpSuite
- https://portswigger.net/burp
- https://danaepp.com/
VS Tips
--------
Extract method from selected code
- Ctrl + R + M
Format document
- Ctrl + K + D
Remove and sort usings
- Ctrl + R + G
Code cleanup (CodeMaid)
- Ctrl + M + Z
Important
----------
ApplicationInsights SamplingSettings for AzFn
- https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json
Design Patterns in C# (see the Factory Method sketch at the end of this section)
- https://www.dofactory.com/net/factory-method-design-pattern
- https://github.com/DovAmir/awesome-design-patterns?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Shopify: Reducing BigQuery Costs
- https://shopify.engineering/reducing-bigquery-costs?utm_source=programmingdigest&utm_medium&utm_campaign=1403
Building Own Operating System
- https://o-oconnell.github.io/2023/01/12/p1os.html?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Debugging LINQ (see the logging sketch at the end of this section)
- https://www.red-gate.com/simple-talk/development/dotnet-development/linq-secrets-revealed-chaining-and-debugging/
- https://michaelscodingspot.com/debug-linq-in-csharp/
Bleeping Computer
- https://www.bleepingcomputer.com/
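
Factory Method in a nutshell - a minimal C# sketch with my own illustrative names (not taken from the dofactory article above):

// Factory Method: subclasses decide which concrete product gets created.
public interface ITransport { string Deliver(); }

public class Truck : ITransport { public string Deliver() => "Delivered by truck"; }
public class Ship : ITransport { public string Deliver() => "Delivered by ship"; }

public abstract class Logistics
{
    // The factory method, overridden by subclasses.
    protected abstract ITransport CreateTransport();

    public string PlanDelivery() => CreateTransport().Deliver();
}

public class RoadLogistics : Logistics
{
    protected override ITransport CreateTransport() => new Truck();
}

public class SeaLogistics : Logistics
{
    protected override ITransport CreateTransport() => new Ship();
}

// Usage: new RoadLogistics().PlanDelivery(); // "Delivered by truck"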
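
For LINQ debugging, a common trick from articles like those above is a pass-through logging step in the chain; a minimal sketch (the Dump helper name is my own):

using System;
using System.Collections.Generic;
using System.Linq;

public static class LinqDebugExtensions
{
    // Pass-through step that logs each element as it flows through the chain.
    public static IEnumerable<T> Dump<T>(this IEnumerable<T> source, string label)
    {
        foreach (var item in source)
        {
            Console.WriteLine($"{label}: {item}");
            yield return item;
        }
    }
}

// Usage:
// var result = numbers.Where(n => n % 2 == 0).Dump("after Where")
//                     .Select(n => n * n).Dump("after Select")
//                     .ToList();
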
Utilities
---------
Handle v5.0
- https://learn.microsoft.com/en-us/sysinternals/downloads/handle?WT.mc_id=DT-MVP-5003978
Auto Increment Build #
- https://stackoverflow.com/questions/826777/how-to-have-an-auto-incrementing-version-number-visual-studio
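
The classic approach from that Stack Overflow thread is a wildcard AssemblyVersion in AssemblyInfo.cs; note that SDK-style projects also need <Deterministic>false</Deterministic> in the .csproj for this to build:

using System.Reflection;

// The * makes the compiler generate Build and Revision automatically
// (days since 2000-01-01 and seconds-since-midnight / 2).
// Remove any [assembly: AssemblyFileVersion] so it follows the AssemblyVersion.
[assembly: AssemblyVersion("1.0.*")]
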
Philosophy
----------
1. Do I have to have a "purpose" to have an address in the USA?
- If yes, then as a human being I must have a purpose? Seriously? Ok, a purpose to whom? To whom must I state my purpose, or execute, or report about it...???
2. System Failure - Zero Day Exploit
3. Good PR example - https://github.com/dotnet/aspnetcore/pull/45587/files
App Insights Log Queries
------------------------
availabilityResults
| where timestamp > datetime("2022-12-19T04:07:00.000Z") and timestamp < datetime("2022-12-20T04:07:00.000Z")
| where customDimensions["WebtestArmResourceName"] == "availability-test-1-app-notepad-physical-activity-dev-eastus"
| where true and true
| extend percentage = toint(success) * 100
| summarize avg(percentage) by bin(timestamp, 1h)
| render timechart
******************************************************************
Calling a WCF Service from a .NET 6 Console App

  1. Install the "dotnet-svcutil" tool:

dotnet tool install --global dotnet-svcutil

  2. Generate the proxy classes from the WSDL file:

dotnet-svcutil MyService.wsdl --output MyServiceProxy.cs

  3. Create a new .NET 6 console app:

dotnet new console -n MyWcfClientApp
cd MyWcfClientApp

  4. Add the System.ServiceModel.Primitives NuGet package:

dotnet add package System.ServiceModel.Primitives

  5. Program.cs:

using System;
using System.ServiceModel;
using System.Threading.Tasks;
using MyServiceNamespace; // Replace this with the correct namespace from MyServiceProxy.cs
class Program
{
    static async Task Main(string[] args)
    {
        // Replace the endpoint URL with the actual URL of your WCF service
        var endpointUrl = "http://localhost:8080/MyService.svc";
        var binding = new BasicHttpBinding();
        var endpoint = new EndpointAddress(endpointUrl);

        var channelFactory = new ChannelFactory<IMyService>(binding, endpoint);
        IMyService client = channelFactory.CreateChannel();

        try
        {
            // Replace the method name and parameters with those from your WCF service
            string result = await client.MyMethodAsync("parameter");
            Console.WriteLine($"Result from the WCF service: {result}");
        }
        catch (Exception ex)
        {
            Console.WriteLine($"Error: {ex.Message}");
        }
        finally
        {
            // A faulted channel throws on Close(); abort it instead.
            var channel = (IClientChannel)client;
            if (channel.State == CommunicationState.Faulted) channel.Abort(); else channel.Close();
            channelFactory.Close();
        }
    }
}
  6. Run:
dotnet build
dotnet run

xcopy Flags

xcopy /D /E /C /Q /H /R /Y /K

source Specifies the file(s) to copy.
destination Specifies the location or name of new files.
/A Copies only files with the archive attribute set, doesn't change the attribute.
/M Copies only files with the archive attribute set, turns off the archive attribute.
/D:m-d-y Copies files changed on or after the specified date. If no date is given, copies only those files whose source time is newer than the destination time.
/EXCLUDE:file1 [+file2][+file3]... Specifies a list of files containing strings. When any of the strings match any part of the absolute path of the file to be copied, that file will be excluded from being copied. For example, specifying a string like \obj\ or .obj will exclude all files underneath the directory obj or all files with the .obj extension respectively.
/P Prompts you before creating each destination file.
/S Copies directories and subdirectories except empty ones.
/E Copies directories and subdirectories, including empty ones. Same as /S /E. May be used to modify /T.
/V Verifies each new file.
/W Prompts you to press a key before copying.
/C Continues copying even if errors occur.
/I If destination does not exist and copying more than one file, assumes that destination must be a directory.
/Q Does not display file names while copying.
/F Displays full source and destination file names while copying.
/L Displays files that would be copied.
/H Copies hidden and system files also.
/R Overwrites read-only files.
/T Creates directory structure, but does not copy files. Does not include empty directories or subdirectories. /T /E includes empty directories and subdirectories.
/U Copies only files that already exist in destination.
/K Copies attributes. Normal Xcopy will reset read-only attributes.
/N Copies using the generated short names.
/O Copies file ownership and ACL information.
/X Copies file audit settings (implies /O).
/Y Suppresses prompting to confirm you want to overwrite an existing destination file.
/-Y Causes prompting to confirm you want to overwrite an existing destination file.
/Z Copies networked files in restartable mode.
/B Copies the Symbolic Link itself versus the target of the link.
/J Copies using unbuffered I/O. Recommended for very large files.
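
As a usage sketch, here is the command above wrapped in C# via Process (source/destination paths are placeholders of my own):

using System.Diagnostics;

// Copy newer files only (/D), include subdirectories and empty ones (/E),
// continue on errors (/C), suppress per-file output (/Q), include hidden and
// system files (/H), overwrite read-only files (/R), skip overwrite prompts (/Y),
// and keep attributes (/K).
var psi = new ProcessStartInfo
{
    FileName = "xcopy",
    Arguments = @"C:\Source D:\Backup\Source /D /E /C /Q /H /R /Y /K",
    UseShellExecute = false
};
using var process = Process.Start(psi)!;
process.WaitForExit();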

Running ngrok

https://dashboard.ngrok.com/get-started/setup

https://ngrok.com/docs/ngrok-agent/config/

  1. cd D:\utils\ngrok-v3-stable-windows-amd64
  2. Launch ngrok.exe
  3. ngrok
  4. ngrok config check
  5. my config:
version: "2"
authtoken: 2OvytBhr3ZdoB5Ke5vX7PuyCl6K_5TKJkNbv88anruazTsNrt
tunnels:
  app:
    addr: 4200
    proto: http
  other-app:
    addr: localhost:7077
    proto: http
  6. To run: ngrok start --all

Helpful Commands

ngrok

(this will give location of ngrok.yml)
ngrok config check

(run on port 80 as http)
ngrok http 80

(use this config and start)
ngrok start --all --config ngrok.yml

(check the config)
ngrok config check

(stop anything running)
ngrok agent stop

(start using default config)
ngrok start --all

Exposing a Local Debug Session to the Internet (ngrok + IIS + Rainforest QA)

Goal:
You want to use a no-code solution to add UI tests for your application, and Rainforest QA has the best solution available for this. That means exposing your local development (F5) experience to the internet so that Rainforest QA's test-runner VM (virtual machine) can connect to it and run tests against it. Rainforest QA tests require a URL with a valid TLD (.net, .com, .org, etc.), and our Prod and Test environments are both intranet sites without one.

How can we give https://localhost:57200/ a .com alias?

Ngrok and IIS! :-)

Assumptions:
1. F5 renders https://localhost:57200/ (when you debug your application, it runs locally on port 57200)
2. Your application is running in debug mode

  1. Ngrok

    1. Download ngrok
    2. After unzipping, I saved the directory in C:\Robert.._utils\ because this is a utility and I want to be able to find it easily later
  2. IIS Application

    1. Create a new Website in IIS
    2. Set the path equal to your local development path where your app runs while debugging (i.e., C:\TFS\ApplicationTeam\Dev\TechPublications)
      a. Keep other defaults
  3. Setup AppPool Identity

    1. Go to the AppPool and find the new website you just created
    2. Right click > Advanced Settings > Identity > "…" (to edit)
    3. Select "Custom Account" > Set
      a. Enter your Parker Domain\UserID (i.e., USPHC\123456)
    4. Ok and Confirm to complete and close
  4. Setup Ngrok

    1. Go to folder where ngrok.exe is located

    2. Double click ngrok.exe

    3. This will open the CLI
      a. Find the location of the ngrok.yml (config) file
      ngrok config check
      This will output the location of the .yml file.
      Go to the folder and open this file.
      b. Update the file by adding the tunnels entry to the .yml file
      i. Enter the host localhost:57200
      ii. Specify the protocol as http
      iii. Save and close (remember this is .yml, so make sure the indentation is correct; use an online YAML checker if needed)
      c. Now start ngrok: ngrok start --all
      i. You will see the hostname (forwarding URL) ngrok assigned to your local running debug session
      ii. Copy that hostname

    4. Verify it is working by navigating to the URL

    5. Update IIS Website Bindings
      a. Go back into IIS
      b. Go to the site we created in Step 2 above
      c. Right click > Edit Bindings
      d. You will add 3 Bindings:
      i. Local pointer to your running application (port 57200):
      1) Type: http
      2) IP Address: All Unassigned
      3) Host Name: (leave empty)
      4) Port: 57200
      ii. Http mapping from ngrok (port 80):
      1) Type: http
      2) IP Address: All Unassigned
      3) Host Name: ngrok hostname (from the forwarding URL)
      4) Port: 80
      iii. Https mapping from ngrok (port 443):
      1) Type: https
      2) IP Address: All Unassigned
      3) Host Name: ngrok hostname (from the forwarding URL)
      4) Port: 443
      e. Stop the Website
      f. Stop the App Pool
      g. Start the App Pool
      h. Start the Website
      i. You should be able to navigate to your local running debug session from IIS by clicking on the websites on the right

  5. Setup Rainforest QA

    1. Go to Rainforest QA and sign up for an account

    2. Create a new test

    3. Click "Add a New Site" (if it is not defaulted to it)

    4. Click "Create"

    5. You will see the Virtual Machine spin up and render the "Visit Site" page that shows up when you browse to the ngrok endpoint

    6. You are now ready to create tests (see video)

Powerful Data Table

https://handsontable.com/demo

See VB Script behind Excel data

Alt + F11

Random Key Generator

https://randomkeygen.com/

Set Execution Policy (PowerShell)

(allow local scripts; downloaded scripts must be signed)
Set-ExecutionPolicy RemoteSigned

(back to the Windows client default: block all scripts)
Set-ExecutionPolicy Restricted

Basic Python Web Scraper (requests + BeautifulSoup)

Let's create a basic web scraper using Python that will scrape a simple, publicly accessible news website. We'll use BeautifulSoup for parsing HTML content and requests for making HTTP requests. The scraper will extract article titles and their corresponding URLs from the website's main page. Here's a step-by-step plan:

  1. Setup: Install BeautifulSoup and requests libraries.
  2. Fetching Web Page Content: Use requests to fetch the HTML content of the news website.
  3. Parsing HTML Content: Use BeautifulSoup to parse the HTML and extract the desired data.
  4. Storing Results: Store the extracted titles and URLs in a structured format like a CSV file.
  5. Handling Exceptions: Implement error handling to manage potential issues like network problems.
  6. Ethical Considerations: Brief discussion on ethical web scraping practices.

Here is sample code for these steps, which you can run in your local Python environment:

import requests
from bs4 import BeautifulSoup
import csv

def scrape_news_site(url):
    # Send a request to the website
    response = requests.get(url, timeout=10)
    response.raise_for_status()  # Fail fast on HTTP errors
    # Parse the HTML content of the page
    soup = BeautifulSoup(response.text, 'html.parser')

    # Find all article elements (this depends on the website's structure)
    articles = soup.find_all('article')

    news_data = []

    for article in articles:
        title_tag = article.find('h2')  # Example; change the tag based on the actual structure
        link_tag = article.find('a')
        if title_tag is None or link_tag is None:
            continue  # Skip articles that don't match the expected structure
        news_data.append({'title': title_tag.get_text(strip=True), 'url': link_tag['href']})

    return news_data

def save_to_csv(data, filename):
    with open(filename, 'w', newline='', encoding='utf-8') as file:
        writer = csv.DictWriter(file, fieldnames=['title', 'url'])
        writer.writeheader()
        for row in data:
            writer.writerow(row)

def main():
    url = 'https://example-news-site.com'  # Replace with the actual URL
    news_data = scrape_news_site(url)
    save_to_csv(news_data, 'news_data.csv')

    print("Scraping completed and data saved to news_data.csv")

if __name__ == "__main__":
    main()

In this example, we're assuming a simple structure for the news website where each article is contained in an <article> tag, and the title is in an <h2> tag within it. The actual tags and structure would depend on the specific website you're scraping.

Remember, this is a basic example for educational purposes. Real-world web scraping can be more complex due to the varied and dynamic nature of websites.


.NET Conf 2023 Links

  1. https://github.com/sayedihashimi/restaurantservice
  2. VS Code Extension Building - GitHub @kinfey

Handsontable

<style type="text/css">
        /*#requestTable .ht_master .htCore > tbody > tr:first-child > td {
            background-color: #b6ff00;
        }*/

        #requestTable .ht_master .htCore > tbody > tr:nth-child(even) > td {
            background-color: #f3f3f3;
        }

        #requestTable .ht_master .htCore > tbody > tr > td {
            font-size: 16px;
        }

        #requestTable .ht_master .htCore > thead > tr > th {
            font-size: 22px;
            font-weight:bold;
            color:blue;
            background-color:blue;
        }

        #requestTable .ht_master .htCore > tbody > tr > th {
            font-size: 22px;
            font-weight:bold;
            color:blue;
            background-color:blue;
        }
    </style>
    <script type="text/javascript">
        $(document).ready(() => {
            return; // NOTE: this early return currently disables all of the table setup below; remove it to enable
            var _ = <%= GetJsonTableData() %>;
            const [header, ...rows] = _;

            console.log(_);
            const container = document.querySelector('#requestTable');

            const hot = new Handsontable(container, {
                contextMenu: true,
                columnSorting: true,
                //contextMenu: ['remove_row'
                //    , 'undo'
                //    , 'copy'
                //    , 'copy_with_column_headers'
                //    , 'copy_with_column_group_headers'
                //    , 'freeze_column'
                //    , 'unfreeze_column'
                //    , 'filter_by_condition'
                //    , 'filter_operators'
                //    , 'filter_by_value'
                //    , 'filter_action_bar'],
                //columnSorting: {
                //    initialConfig: {
                //        column: 1,
                //        sortOrder: 'desc'
                //    },
                //    // sort empty cells as well
                //    sortEmptyCells: true,
                //    // display the arrow icon in the column header
                //    indicator: true,
                //    // disable clicking on the column header to sort the column
                //    headerAction: true,
                //    // add a custom compare function
                //    //compareFunctionFactory(sortOrder, columnMeta) {
                //    //    return function (value, nextValue) {
                //    //        // some value comparisons which will return -1, 0 or 1...
                //    //    }
                //    //}
                //},
                // set any other column's width to the default 50px (note that longer cell values and column names can get cut)
                colWidths: [80, 100, 100, 100, 200, 100, 100, 100, 100, 100],
                readOnly: true,
                data: rows,
                //fixedRowsTop: 1,
                rowHeaders: true,
                colHeaders: header,
                height: '50vh',
                width: '100vw',
                stretchH: 'all',
                // enable filtering
                filters: true,
                // enable the column menu
                dropdownMenu: true,
                manualColumnResize: true,
                licenseKey: 'non-commercial-and-evaluation', // for non-commercial use only
                afterOnCellMouseUp: function (event, coords, TD) {
                    if (coords.row == 0) {
                        return;
                    }
                    if (coords.row < 0 || coords.col < 0) {
                        return;
                    }

                    var row = coords.row;
                    var data = this.getDataAtRow(row);

                    var id = data[0];
                    var documentType = data[15];
                    var customer = data[8];
                    var mil_comm = data[10];
                    var partNumber = data[1];
                    var program = data[7];
                    var division = data[11];
                    var status = data[17];
                    var requestor = data[13];
                    var requestStatus = data[20];
                    var pdf = data[19];

                    //alert(`Id: ${id}`);
                    //alert(`Document Type: ${documentType}`);

                    var destination = '';

                    if (documentType == 'CMM') {
                        destination = `CMMRequest?ID=${id}`;
                    } else {
                        destination = `ServiceBulletinRequest?ID=${id}`;
                    }

                    var port = window.location.port;
                    var url = '';

                    if (port == undefined
                        || port == null
                        || port == '') {
                        url = `${window.location.protocol}//${window.location.hostname}/techpubs/${destination}`;
                    } else {
                        url = `${window.location.protocol}//${window.location.hostname}:${port}/${destination}`;
                    }

                    window.location = `${url}`;
                }
            });

            hot.alter('remove_col', 10);
        })
    </script>

SQL: Batch Deletes

DECLARE @BatchSize INT = 10000;
DECLARE @RowCount INT = @BatchSize;

BEGIN TRY
	BEGIN TRAN;

		WHILE @RowCount = @BatchSize
			BEGIN
				DELETE FROM "x".dbo.my_table
				WHERE contact_id IN (
					SELECT TOP(@BatchSize) contact_id FROM "x".dbo.my_table
					WHERE created_date <= '2020-12-31'
  					  AND (activity_code = 'APPEAL' 
					  OR activity_code = 'EO')
				);
				SET @RowCount = @@ROWCOUNT;
			END

		SET @RowCount = @BatchSize;

		WHILE @RowCount = @BatchSize
			BEGIN
				DELETE TOP(@BatchSize) FROM "x".dbo.my_table
				WHERE created_date <= '2020-12-31'
  					  AND (activity_code = 'APPEAL' 
					  OR activity_code = 'EO');
				SET @RowCount = @@ROWCOUNT;
			END
	  COMMIT TRAN;
END TRY
BEGIN CATCH
	ROLLBACK TRANSACTION;

		SELECT 
			ERROR_NUMBER()
			, ERROR_SEVERITY()
			, ERROR_STATE()
			, ERROR_PROCEDURE()
			, ERROR_LINE()
			, ERROR_MESSAGE();
END CATCH

RegEx Find & Replace - (DateTime)


CAST\(.*? AS DateTime\)
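
As an illustration of what the pattern matches, a hedged C# sketch (the sample SQL and the $1 replacement are my own additions):

using System;
using System.Text.RegularExpressions;

var sql = "SELECT CAST(created_date AS DateTime) FROM my_table";

// Capture the inner expression and strip the CAST(... AS DateTime) wrapper.
var cleaned = Regex.Replace(sql, @"CAST\((.*?) AS DateTime\)", "$1");

Console.WriteLine(cleaned); // SELECT created_date FROM my_table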

PrimeIcons

The pi-eye and pi-save are part of the PrimeIcons library, which is used by PrimeFaces and PrimeNG components¹². The icons in this library use the pi pi-{icon} syntax, such as pi pi-check¹².

You can find the full list of available icons in the PrimeIcons library on the official PrimeFaces¹ and PrimeNG² websites. More icons are added periodically, and you can also request new icons at the issue tracker².

Here's an example of how you can use these icons:

<i class="pi pi-check"></i>
<i class="pi pi-times"></i>
<i class="pi pi-eye"></i>
<i class="pi pi-save"></i>

You can control the size of an icon with the font-size property of the element². For example:

<i class="pi pi-check" style="font-size: 1rem"></i>
<i class="pi pi-times" style="font-size: 1.5rem"></i>
<i class="pi pi-eye" style="font-size: 2rem"></i>
<i class="pi pi-save" style="font-size: 2.5rem"></i>

And you can define the icon color with the color property, which is inherited from the parent by default². For example:

<i class="pi pi-check" style="color: slateblue"></i>
<i class="pi pi-times" style="color: green"></i>
<i class="pi pi-eye" style="color: var(--primary-color)"></i>
<i class="pi pi-save" style="color: #708090"></i>

Remember to import the CSS file of the icon library in styles.scss of your application²:

@import "primeicons/primeicons.css";


Source: Conversation with Bing, 1/27/2024
(1) Icons - PrimeFaces. https://www.primefaces.org/diamond/icons.xhtml.
(2) PrimeNG. https://primeng.org/icons.

Export Azure Web App Configuration Settings

To export the configuration settings from one Azure App Service and import them into another, you can use the Azure CLI or PowerShell. Here's how you can do it using the Azure CLI:

  1. Export Configuration Settings:

    az webapp config appsettings list --name <app-name> --resource-group <resource-group-name> --output json > appsettings.json

    This command will export the application settings of the specified Azure App Service to a JSON file named appsettings.json.

  2. Import Configuration Settings:

    az webapp config appsettings set --name <app-name> --resource-group <resource-group-name> --settings @appsettings.json

    This command will import the application settings from the appsettings.json file and apply them to the specified Azure App Service.

Make sure to replace <app-name> and <resource-group-name> with the appropriate values for your Azure App Service.

By following these steps, you can easily export and import the configuration settings between Azure App Services.


Service Bus Queues and Topics

az servicebus queue list --resource-group cge-rg-cgecoresandbox-dev --namespace-name sbns-cge-hzz-api-dev
az servicebus topic list --resource-group cge-rg-cgecoresandbox-dev --namespace-name sbns-cge-hzz-api-dev

# Define variables
source_rg=cge-rg-cgecoresandbox-dev
source_ns=sbns-cge-hzz-api-dev
target_rg=cge-rg-cgecoresandbox-qa
target_ns=sbns-cge-hzz-api-qa   # namespace names are globally unique, so the target namespace must differ from the source

# Get list of queues from source namespace
queues=$(az servicebus queue list --resource-group $source_rg --namespace-name $source_ns --query "[].name" -o tsv)

# Copy each queue to target namespace
for queue in $queues; do
  az servicebus queue create --resource-group $target_rg --namespace-name $target_ns --name $queue
done
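
A rough .NET equivalent of the loop above, assuming the Azure.Messaging.ServiceBus package; the connection strings are placeholders, and like the CLI loop this copies queue names only, not their settings:

using Azure.Messaging.ServiceBus.Administration;

var source = new ServiceBusAdministrationClient("<source-namespace-connection-string>");
var target = new ServiceBusAdministrationClient("<target-namespace-connection-string>");

// Enumerate queues in the source namespace and create any that are missing in the target.
await foreach (var queue in source.GetQueuesAsync())
{
    var exists = (await target.QueueExistsAsync(queue.Name)).Value;
    if (!exists)
    {
        await target.CreateQueueAsync(queue.Name);
    }
}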

Single Endpoint for Multi-Region App Services (Traffic Manager / Front Door)

To achieve a single endpoint that routes traffic to your app service instances deployed in different regions, you can use Azure Traffic Manager or Azure Front Door. These services provide global load balancing and routing capabilities, allowing you to direct incoming requests to the nearest or most available instance of your app service based on factors such as geographic location, latency, or endpoint health.

Here's how you can set it up:

  1. Deploy your app service instances: Deploy your app service to multiple regions as you normally would.

  2. Configure Azure Traffic Manager or Azure Front Door:

    • Create a new instance of either Azure Traffic Manager or Azure Front Door in your Azure portal.
    • Define the routing method you prefer, such as geographic routing or performance-based routing.
    • Add the endpoints of your app service instances deployed in different regions to the routing configuration.
    • Configure the desired routing behavior, such as priority-based or weighted routing, depending on your requirements.
  3. Update DNS settings: Point your domain's DNS records to the endpoint provided by Azure Traffic Manager or Azure Front Door. This ensures that incoming requests to your domain are routed through the load balancer or traffic manager, which then forwards them to the appropriate app service instance based on the routing rules you've defined.

By using Azure Traffic Manager or Azure Front Door, you can maintain a single endpoint for your application while distributing traffic across multiple app service instances deployed in different regions for improved availability and performance.

SQL: Find Indexes in DB

To find all indexes for all tables in any database in SQL Server, you can use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName,
    ind.name AS IndexName,
    col.name AS ColumnName
FROM 
    sys.tables t
INNER JOIN 
    sys.indexes ind ON t.object_id = ind.object_id
INNER JOIN 
    sys.index_columns ic ON ind.object_id = ic.object_id AND ind.index_id = ic.index_id
INNER JOIN 
    sys.columns col ON t.object_id = col.object_id AND col.column_id = ic.column_id
WHERE 
    ind.index_id > 0 -- Ignore heap tables
    AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
ORDER BY 
    SchemaName,
    TableName,
    IndexName,
    ic.index_column_id;

Replace YourDatabaseName with the name of your database. You will need to run this script for each database you have.

For the second query, to find tables without indexes in any database, use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName
FROM 
    sys.tables t
WHERE 
    NOT EXISTS (
        SELECT 
            1 
        FROM 
            sys.indexes ind 
        WHERE 
            t.object_id = ind.object_id 
            AND ind.index_id > 0 -- Indexes with index_id = 0 are heaps
            AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
    )
    AND t.type = 'U' -- Only include user tables
ORDER BY 
    SchemaName,
    TableName;

Again, replace YourDatabaseName with the name of your database. This script also needs to be run for each database.

If you are looking to run these queries across all databases on a server, you would need to create a dynamic SQL script that iterates through each database and executes the query. However, this approach is more complex and should be handled with care, as it may have implications for performance and security.

Manually Rebuilding Indexes

For SQL Server, you can rebuild all indexes on a table with the following command:

ALTER INDEX ALL ON YourTableName REBUILD;

If you have many tables and want to rebuild all indexes on all tables, you can use a cursor to loop through the tables:

DECLARE @TableName VARCHAR(512)

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME([name]) -- schema-qualify so non-dbo tables work
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC('ALTER INDEX ALL ON ' + @TableName + ' REBUILD WITH (ONLINE = ON)')
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Remember to replace YourTableName with the actual table name. The WITH (ONLINE = ON) option allows the index to be rebuilt without taking the underlying table offline. This is only available in SQL Server Enterprise edition.

Automating Index Rebuilds with a Job

In SQL Server, you can schedule a job using SQL Server Agent:

  1. Open SQL Server Management Studio (SSMS).
  2. Connect to your SQL Server instance.
  3. Expand the "SQL Server Agent" node.
  4. Right-click on "Jobs" and choose "New Job...".
  5. Name your job and provide a description.
  6. Go to the "Steps" page and create a new step.
  7. Name the step and set the type to "Transact-SQL script (T-SQL)".
  8. Paste your index rebuild script into the command window.
  9. Go to the "Schedules" page and create a new schedule.
  10. Set the schedule to run daily at a time when the database load is low, like during the night.

Here is an example T-SQL script for the job step that rebuilds all indexes on all user tables:

DECLARE @TableName VARCHAR(512)
DECLARE @Sql NVARCHAR(1000)

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME([name]) -- schema-qualify so non-dbo tables work
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'ALTER INDEX ALL ON ' + @TableName + N' REBUILD WITH (ONLINE = ON)'
    EXEC sp_executesql @Sql
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Make sure you have proper backups and understand that index rebuilding can be resource-intensive. Monitor the job execution to ensure it doesn't adversely affect production workloads.

Also, consider whether you need to rebuild all indexes; sometimes, it's better to analyze index usage and fragmentation levels and only rebuild or reorganize where necessary. There are scripts available to help with this, such as Ola Hallengren's SQL Server Maintenance Solution.

Storing data in crystals

Storing data in crystals, specifically using 5D optical data storage in nanostructured glass, is a fascinating and cutting-edge technology. However, it's important to note that this technology is still in the experimental stage and not yet available for everyday use, like storing a "Hello World" program in a crystal.

The process involves using ultrafast lasers to create microscopic structures in fused quartz. These structures, representing the stored data, can be manipulated in five dimensions: three spatial dimensions plus two additional configurations related to the orientation and size of the nanostructures. The University of Southampton has been at the forefront of this research. Their technique allows for incredibly dense and long-lasting data storage – up to 360 terabytes of data can potentially be stored in a disc the size of a conventional CD, with a lifespan of up to 13.8 billion years.

Reading the data requires specific equipment like an optical microscope and a polarizer due to the microscopic and complex nature of the data encoding.

Meanwhile, Project Silica by Microsoft is exploring similar technology, focusing on storing data in quartz glass for long-term archival purposes. They aim to create a storage technology that's both sustainable and secure, using a unique approach that involves writing data with femtosecond lasers and reading it with polarization-sensitive microscopy.

It's an exciting field, but for now, these technologies are mainly in the research phase and not yet available for personal computing or general programming applications. Keep an eye on the progress in this area, as it holds great potential for future data storage solutions!

.NET and Software Engineering Jokes

Here are 10 .NET and software engineering-themed jokes for you and your coworkers:

  1. Why do C# and VB.NET developers stay away from the beach? Because they can't handle unmanaged sandboxes.

  2. Why was the JavaScript developer sad? Because he didn't Node how to Express himself!

  3. Why do programmers always mix up Halloween and Christmas? Because Oct 31 equals Dec 25.

  4. Why did the .NET developer feel boxed in? Because he was surrounded by too many frameworks!

  5. How many software engineers does it take to change a light bulb? None, that's a hardware problem.

  6. Why did the software engineer go to art school? To learn how to handle exceptions beautifully.

  7. What do you call a .NET developer who writes clean code? A C#arp individual.

  8. Why don't .NET developers play hide and seek? Because good luck hiding when the garbage collector is running.

  9. How do you praise a .NET developer? "You're so good, you must have a really strong Framework of knowledge!"

  10. Why was the ASP.NET website so alert? Because it was always awaiting requests.

Enjoy sharing these with your team! 😄

Git: List Branches with Exclusions (PowerShell)

You can use the git branch command in PowerShell to list all branches in your repository. To exclude specific branches like 'master' and 'branch one', you can use the grep -v command which inverts the match. Here is the command you can use:

git branch | grep -v "master\|branch one"

This command will list all the branches in your current repository except for 'master' and 'branch one'. Please replace 'master' and 'branch one' with the actual names of the branches you want to exclude if they are different.

Please note that this command requires Git Bash utilities. If you're using pure PowerShell, you might need to use a different approach as grep is not a built-in command in PowerShell. Here's how you can do it in pure PowerShell:

git branch | Where-Object {$_ -notmatch 'master|branch one'}

This command does the same thing: it lists all branches except 'master' and 'branch one'. The Where-Object cmdlet in PowerShell is used to filter out objects from a collection that do not meet a certain condition. In this case, the condition is that the branch name does not match 'master' or 'branch one'.

Git: Delete All Local Branches with Exclusions (PowerShell)

You can delete all branches in your repository except for 'master', 'branch one', and the currently checked out branch using the following command in PowerShell:

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -d $_.trim() } }

This command does the following:

  • git branch lists all branches.
  • ForEach-Object loops over each branch.
  • if ($_ -notmatch '\*|master|branch one') checks if the branch is not the currently checked out branch (represented by *) and is not 'master' or 'branch one'.
  • git branch -d $_.trim() deletes the branch if the condition is met.

Please replace 'master' and 'branch one' with the actual names of the branches you want to keep if they are different.

Note: This command will only delete branches that have been merged. If you want to delete all branches regardless of whether they have been merged or not, you can use the -D option instead of -d.

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -D $_.trim() } }

Be careful with this command as it will delete branches even if they contain unmerged changes. Always make sure to check the branches and their changes before deleting them. It's a good practice to keep your work committed and pushed to a remote repository to avoid any loss of work.

PowerShell: Keep Session Awake

# Toggles Scroll Lock every 60 seconds to keep the session from idling or locking
$WShell = New-Object -Com Wscript.Shell
while (1) { $WShell.SendKeys("{SCROLLLOCK}"); sleep 60 }
