
@DocGreenRob
Last active May 5, 2025 21:45
.Net Engineer Pro Tools
Windows Pro Tips
-----------------
PowerToys - https://apps.microsoft.com/store/detail/microsoft-powertoys/XP89DCGQ3K6VLD
DevToys - https://apps.microsoft.com/store/detail/devtoys/9PGCV4V3BK4W
Visual Studio 2022 Pro Tips
---------------------------
VSColorOutput - https://marketplace.visualstudio.com/items?itemName=MikeWard-AnnArbor.VSColorOutput
SolutionColor - https://marketplace.visualstudio.com/items?itemName=Wumpf.SolutionColor
Save VS settings to apply to another computer - https://learn.microsoft.com/en-us/visualstudio/install/import-export-installation-configurations?view=vs-2022
Podcasts
--------
Dev interrupted
Hacking Humans
Cyber Security Headlines
Click Here
Malicious Life
The Stack Overflow Podcast
The Backend Engineering Show (with Hussein Nasser)
The Changelog: Software Development, Open Source
Tech Stuff
Cyberwire Daily
Techmeme Ride Home
Soft Skills Engineering
Syntax - Tasty Web Development Treats
Cyber Security Today
Software Engineering Daily
Developer Tea
Coding Blocks .NET
The Cloud Cast
JS Party: JavaScript, CSS, Web Development
Go Time: Golang, Software Engineering
Cyber
Dev Questions with Tim Corey
Thoughtworks Technology Podcast
.NET Rocks!
Smashing Security
Hanselminutes with Scott Hanselman
Software Engineering
Talk Python To Me
Security Now
Darknet Diaries
Hacked
The .NET Core Podcast
The .NET MAUI Podcast
Kubernetes Podcast from Google
Adventures in .NET
Coding After Work
Base.cs Podcast
The Static Void Podcast
Tools
------
couchbase
honeycomb.io/changelog
firehydrant
logrocket
playwright
openmct
thundra.io
raygun
fly.io
appwrite
sentry.io
https://sourcegraph.com/
https://www.kolide.com/
https://entity.services/
WeekPlan
Docker Extensions
------------------
Ddosify - High-performance load testing tool
- https://github.com/ddosify/ddosify
BurpSuite
- https://portswigger.net/burp
- https://danaepp.com/
VS Tips
--------
Extract method from selected code
- Ctrl + R + M
Format document
- Ctrl + K + D
Remove and sort usings
- Ctrl + R + G
Ctrl + M + Z (CodeMaid)
Important
----------
ApplicationInsights SamplingSettings for AzFn
- https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json
Design Patterns in C#
- https://www.dofactory.com/net/factory-method-design-pattern
- https://github.com/DovAmir/awesome-design-patterns?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Shopify Query
- https://shopify.engineering/reducing-bigquery-costs?utm_source=programmingdigest&utm_medium&utm_campaign=1403
Building Own Operating System
- https://o-oconnell.github.io/2023/01/12/p1os.html?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Debugging LINQ
- https://www.red-gate.com/simple-talk/development/dotnet-development/linq-secrets-revealed-chaining-and-debugging/
--> https://michaelscodingspot.com/debug-linq-in-csharp/
Bleeping Computer
- https://www.bleepingcomputer.com/
Utilities
---------
Handle v5.0
- https://learn.microsoft.com/en-us/sysinternals/downloads/handle?WT.mc_id=DT-MVP-5003978
Auto Increment Build #
- https://stackoverflow.com/questions/826777/how-to-have-an-auto-incrementing-version-number-visual-studio
Philosophy
----------
1. Do I have to have a "purpose" to have an address in the USA?
- If yes, then as a human being must I have a purpose? Seriously? OK, a purpose to whom? To whom must I state my purpose, or execute, or report about it...???
2. System Failure - Zero Day Exploit
3. Good PR example - https://github.com/dotnet/aspnetcore/pull/45587/files
App Insights Log Queries
------------------------
availabilityResults
| where timestamp > datetime("2022-12-19T04:07:00.000Z") and timestamp < datetime("2022-12-20T04:07:00.000Z")
| where customDimensions["WebtestArmResourceName"] == "availability-test-1-app-notepad-physical-activity-dev-eastus"
| where true and true
| extend percentage = toint(success) * 100
| summarize avg(percentage) by bin(timestamp, 1h)
| render timechart
******************************************************************
@DocGreenRob

To achieve a single endpoint that routes traffic to your app service instances deployed in different regions, you can use Azure Traffic Manager or Azure Front Door. These services provide global load balancing and routing capabilities, allowing you to direct incoming requests to the nearest or most available instance of your app service based on factors such as geographic location, latency, or endpoint health.

Here's how you can set it up:

  1. Deploy your app service instances: Deploy your app service to multiple regions as you normally would.

  2. Configure Azure Traffic Manager or Azure Front Door:

    • Create a new instance of either Azure Traffic Manager or Azure Front Door in your Azure portal.
    • Define the routing method you prefer, such as geographic routing or performance-based routing.
    • Add the endpoints of your app service instances deployed in different regions to the routing configuration.
    • Configure the desired routing behavior, such as priority-based or weighted routing, depending on your requirements.
  3. Update DNS settings: Point your domain's DNS records to the endpoint provided by Azure Traffic Manager or Azure Front Door. This ensures that incoming requests to your domain are routed through the load balancer or traffic manager, which then forwards them to the appropriate app service instance based on the routing rules you've defined.

By using Azure Traffic Manager or Azure Front Door, you can maintain a single endpoint for your application while distributing traffic across multiple app service instances deployed in different regions for improved availability and performance.
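As a concrete illustration, here is a minimal sketch of the Traffic Manager variant using the Az PowerShell module; the resource group, app, and DNS names below are hypothetical placeholders, and Front Door would use its own cmdlets instead.

# Sketch: performance-routed Traffic Manager profile fronting two existing App Service instances.
# All names below are placeholders.
New-AzTrafficManagerProfile -Name "tm-myapp" -ResourceGroupName "rg-myapp" `
    -TrafficRoutingMethod Performance -RelativeDnsName "myapp-global" -Ttl 30 `
    -MonitorProtocol HTTPS -MonitorPort 443 -MonitorPath "/"

$eastApp = Get-AzWebApp -ResourceGroupName "rg-myapp-eastus" -Name "app-myapp-eastus"
$westApp = Get-AzWebApp -ResourceGroupName "rg-myapp-westus" -Name "app-myapp-westus"

New-AzTrafficManagerEndpoint -Name "eastus" -ProfileName "tm-myapp" -ResourceGroupName "rg-myapp" `
    -Type AzureEndpoints -TargetResourceId $eastApp.Id -EndpointStatus Enabled
New-AzTrafficManagerEndpoint -Name "westus" -ProfileName "tm-myapp" -ResourceGroupName "rg-myapp" `
    -Type AzureEndpoints -TargetResourceId $westApp.Id -EndpointStatus Enabled

The profile's DNS name (myapp-global.trafficmanager.net in this sketch) becomes the single endpoint to point your domain's CNAME at.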


DocGreenRob commented Mar 3, 2024

SQL: Find Indexes in DB

To find all indexes for all tables in any database in SQL Server, you can use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName,
    ind.name AS IndexName,
    col.name AS ColumnName
FROM 
    sys.tables t
INNER JOIN 
    sys.indexes ind ON t.object_id = ind.object_id
INNER JOIN 
    sys.index_columns ic ON ind.object_id = ic.object_id AND ind.index_id = ic.index_id
INNER JOIN 
    sys.columns col ON t.object_id = col.object_id AND col.column_id = ic.column_id
WHERE 
    ind.index_id > 0 -- Ignore heap tables
    AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
ORDER BY 
    SchemaName,
    TableName,
    IndexName,
    ic.index_column_id;

Replace YourDatabaseName with the name of your database. You will need to run this script for each database you have.

Second, to find tables without indexes, use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName
FROM 
    sys.tables t
WHERE 
    NOT EXISTS (
        SELECT 
            1 
        FROM 
            sys.indexes ind 
        WHERE 
            t.object_id = ind.object_id 
            AND ind.index_id > 0 -- Indexes with index_id = 0 are heaps
            AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
    )
    AND t.type = 'U' -- Only include user tables
ORDER BY 
    SchemaName,
    TableName;

Again, replace YourDatabaseName with the name of your database. This script also needs to be run for each database.

If you are looking to run these queries across all databases on a server, you would need to create a dynamic SQL script that iterates through each database and executes the query. However, this approach is more complex and should be handled with care, as it may have implications for performance and security.
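As a rough sketch of that approach, the per-database check can be wrapped in sp_MSforeachdb (an undocumented but widely used helper; a cursor over sys.databases with dynamic SQL is the fully supported alternative):

-- Sketch: run the "tables without indexes" check in every user database.
EXEC sp_MSforeachdb N'
IF DB_ID(''?'') > 4 -- skip master, tempdb, model, msdb
BEGIN
    USE [?];
    SELECT
        DB_NAME() AS DatabaseName,
        SCHEMA_NAME(t.schema_id) AS SchemaName,
        t.name AS TableName
    FROM sys.tables t
    WHERE NOT EXISTS (
              SELECT 1
              FROM sys.indexes ind
              WHERE t.object_id = ind.object_id
                AND ind.index_id > 0
                AND ind.is_hypothetical = 0
          )
      AND t.type = ''U'';
END';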


DocGreenRob commented Mar 3, 2024

Manually Rebuilding Indexes

For SQL Server, you can rebuild all indexes on a table with the following command:

ALTER INDEX ALL ON YourTableName REBUILD;

If you have many tables and want to rebuild all indexes on all tables, you can use a cursor to loop through the tables:

DECLARE @TableName NVARCHAR(600) -- schema-qualified name, e.g. [dbo].[MyTable]

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME([name])
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC(N'ALTER INDEX ALL ON ' + @TableName + N' REBUILD WITH (ONLINE = ON)')
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Remember to replace YourTableName with the actual table name. The WITH (ONLINE = ON) option allows the index to be rebuilt without taking the underlying table offline; it is only available in SQL Server Enterprise edition (and Azure SQL Database), so remove that option on other editions.

Automating Index Rebuilds with a Job

In SQL Server, you can schedule a job using SQL Server Agent:

  1. Open SQL Server Management Studio (SSMS).
  2. Connect to your SQL Server instance.
  3. Expand the "SQL Server Agent" node.
  4. Right-click on "Jobs" and choose "New Job...".
  5. Name your job and provide a description.
  6. Go to the "Steps" page and create a new step.
  7. Name the step and set the type to "Transact-SQL script (T-SQL)".
  8. Paste your index rebuild script into the command window.
  9. Go to the "Schedules" page and create a new schedule.
  10. Set the schedule to run daily at a time when the database load is low, like during the night.

Here is an example T-SQL script for the job step that rebuilds all indexes on all user tables:

DECLARE @TableName NVARCHAR(600) -- schema-qualified table name
DECLARE @Sql NVARCHAR(1000)

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME([name])
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'ALTER INDEX ALL ON ' + @TableName + N' REBUILD WITH (ONLINE = ON)'
    EXEC sp_executesql @Sql
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Make sure you have proper backups and understand that index rebuilding can be resource-intensive. Monitor the job execution to ensure it doesn't adversely affect production workloads.

Also, consider whether you need to rebuild all indexes; sometimes, it's better to analyze index usage and fragmentation levels and only rebuild or reorganize where necessary. There are scripts available to help with this, such as Ola Hallengren's SQL Server Maintenance Solution.
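For example, here is a minimal sketch of a fragmentation check against sys.dm_db_index_physical_stats for the current database; the 5%/30% thresholds are the commonly cited guidance, not a hard rule:

-- Sketch: list fragmented indexes in the current database and suggest an action.
SELECT
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName,
    ind.name AS IndexName,
    ps.avg_fragmentation_in_percent,
    ps.page_count,
    CASE
        WHEN ps.avg_fragmentation_in_percent >= 30 THEN 'REBUILD'
        WHEN ps.avg_fragmentation_in_percent >= 5  THEN 'REORGANIZE'
        ELSE 'LEAVE AS IS'
    END AS SuggestedAction
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
INNER JOIN sys.indexes ind ON ps.object_id = ind.object_id AND ps.index_id = ind.index_id
INNER JOIN sys.tables t ON t.object_id = ind.object_id
WHERE ind.index_id > 0
  AND ps.page_count > 100 -- small indexes rarely benefit from maintenance
ORDER BY ps.avg_fragmentation_in_percent DESC;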


DocGreenRob commented Mar 16, 2024

Storing data in crystals

Storing data in crystals, specifically using 5D optical data storage in nanostructured glass, is a fascinating and cutting-edge technology. However, it's important to note that this technology is still in the experimental stage and not yet available for everyday use, like storing a "Hello World" program in a crystal.

The process involves using ultrafast lasers to create microscopic structures in fused quartz. These structures, representing the stored data, can be manipulated in five dimensions: three spatial dimensions plus two additional configurations related to the orientation and size of the nanostructures. The University of Southampton has been at the forefront of this research. Their technique allows for incredibly dense and long-lasting data storage – up to 360 terabytes of data can potentially be stored in a disc the size of a conventional CD, with a lifespan of up to 13.8 billion years.

Reading the data requires specific equipment like an optical microscope and a polarizer due to the microscopic and complex nature of the data encoding.

Meanwhile, Project Silica by Microsoft is exploring similar technology, focusing on storing data in quartz glass for long-term archival purposes. They aim to create a storage technology that's both sustainable and secure, using a unique approach that involves writing data with femtosecond lasers and reading it with polarization-sensitive microscopy.

It's an exciting field, but for now, these technologies are mainly in the research phase and not yet available for personal computing or general programming applications. Keep an eye on the progress in this area, as it holds great potential for future data storage solutions!

@DocGreenRob

Certainly! Here are 10 .NET and software engineering-themed jokes for you and your coworkers:

  1. Why do C# and VB.NET developers stay away from the beach? Because they can't handle unmanaged sandboxes.

  2. Why was the JavaScript developer sad? Because he didn't Node how to Express himself!

  3. Why do programmers always mix up Halloween and Christmas? Because Oct 31 equals Dec 25.

  4. Why did the .NET developer feel boxed in? Because he was surrounded by too many frameworks!

  5. How many software engineers does it take to change a light bulb? None, that's a hardware problem.

  6. Why did the software engineer go to art school? To learn how to handle exceptions beautifully.

  7. What do you call a .NET developer who writes clean code? A C#arp individual.

  8. Why don't .NET developers play hide and seek? Because good luck hiding when the garbage collector is running.

  9. How do you praise a .NET developer? "You're so good, you must have a really strong Framework of knowledge!"

  10. Why was the ASP.NET website so alert? Because it was always awaiting requests.

Enjoy sharing these with your team! 😄

@DocGreenRob

You can use the git branch command in PowerShell to list all branches in your repository. To exclude specific branches like 'master' and 'branch one', you can use the grep -v command which inverts the match. Here is the command you can use:

git branch | grep -v "master\|branch one"

This command will list all the branches in your current repository except for 'master' and 'branch one'. Please replace 'master' and 'branch one' with the actual names of the branches you want to exclude if they are different.

Please note that this command requires Git Bash utilities. If you're using pure PowerShell, you might need to use a different approach as grep is not a built-in command in PowerShell. Here's how you can do it in pure PowerShell:

git branch | Where-Object {$_ -notmatch 'master|branch one'}

This command does the same thing: it lists all branches except 'master' and 'branch one'. The Where-Object cmdlet in PowerShell is used to filter out objects from a collection that do not meet a certain condition. In this case, the condition is that the branch name does not match 'master' or 'branch one'.

@DocGreenRob

You can delete all branches in your repository except for 'master', 'branch one', and the currently checked out branch using the following command in PowerShell:

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -d $_.trim() } }

This command does the following:

  • git branch lists all branches.
  • ForEach-Object loops over each branch.
  • if ($_ -notmatch '\*|master|branch one') checks if the branch is not the currently checked out branch (represented by *) and is not 'master' or 'branch one'.
  • git branch -d $_.trim() deletes the branch if the condition is met.

Please replace 'master' and 'branch one' with the actual names of the branches you want to keep if they are different.

Note: This command will only delete branches that have been merged. If you want to delete all branches regardless of whether they have been merged or not, you can use the -D option instead of -d.

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -D $_.trim() } }

Be careful with this command as it will delete branches even if they contain unmerged changes. Always make sure to check the branches and their changes before deleting them. It's a good practice to keep your work committed and pushed to a remote repository to avoid any loss of work.

@DocGreenRob

# Keep the machine awake by sending a Scroll Lock keypress every 60 seconds
$WShell = New-Object -Com Wscript.Shell
while (1) { $WShell.SendKeys("{SCROLLLOCK}"); Start-Sleep -Seconds 60 }
