@DocGreenRob
Last active October 17, 2025 15:54
.Net Engineer Pro Tools
Windows Pro Tips
-----------------
powertoys - https://apps.microsoft.com/store/detail/microsoft-powertoys/XP89DCGQ3K6VLD
devtoys - https://apps.microsoft.com/store/detail/devtoys/9PGCV4V3BK4W
Visual Studio 2022 Pro Tips
---------------------------
vscoloroutput - https://marketplace.visualstudio.com/items?itemName=MikeWard-AnnArbor.VSColorOutput
solutionColor - https://marketplace.visualstudio.com/items?itemName=Wumpf.SolutionColor
save vs settings to apply to other computer - https://learn.microsoft.com/en-us/visualstudio/install/import-export-installation-configurations?view=vs-2022
Podcasts
--------
Dev interrupted
Hacking Humans
Cyber Security Headlines
Click Here
Malicious Life
The Stack Overflow Podcast
The Backend Engineering Show (with Hussein Nasser)
The Changelog: Software Development, Open Source
Tech Stuff
Cyberwire Daily
Techmeme Ride Home
Soft Skills Engineering
Syntax - Tasty Web Development Treats
Cyber Security Today
Software Engineering Daily
Developer Tea
Coding Blocks .NET
The Cloud Cast
JS Party: JavaScript, CSS, Web Development
Go Time: Golang, Software Engineering
Cyber
Dev Questions with Tim Corey
Thoughtworks Technology Podcast
.NET Rocks!
Smashing Security
Hanselminutes with Scott Hanselman
Software Engineering
Talk Python To Me
Security Now
Darknet Diaries
Hacked
The .NET Core Podcast
The .NET MAUI Podcast
Kubernetes Podcast from Google
Adventures in .NET
Coding After Work
Base.cs Podcast
The Static Void Podcast
Tools
------
couchbase
honeycomb.io/changelog
firehydrant
logrocket
playwright
openmct
thundra.io
raygun
fly.io
appwrite
sentry.io
https://sourcegraph.com/
https://www.kolide.com/
https://entity.services/
WeekPlan
Docker Extensions
------------------
Ddosify - High-performance load testing tool
- https://github.com/ddosify/ddosify
BurpSuite
- https://portswigger.net/burp
- https://danaepp.com/
VS Tips
--------
Extract method from selected code
- Ctrl + R + M
Format document
- Ctrl + K + D
Remove and sort usings
- Ctrl + R + G
Clean up file (CodeMaid)
- Ctrl + M + Z
Important
----------
ApplicationInsights SamplingSettings for AzFn
- https://learn.microsoft.com/en-us/azure/azure-functions/functions-host-json
Design Patterns in C#
- https://www.dofactory.com/net/factory-method-design-pattern
- https://github.com/DovAmir/awesome-design-patterns?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Shopify Query
- https://shopify.engineering/reducing-bigquery-costs?utm_source=programmingdigest&utm_medium&utm_campaign=1403
Building Own Operating System
- https://o-oconnell.github.io/2023/01/12/p1os.html?utm_source=programmingdigest&utm_medium&utm_campaign=1493
Debugging Linq
- https://www.red-gate.com/simple-talk/development/dotnet-development/linq-secrets-revealed-chaining-and-debugging/
--> https://michaelscodingspot.com/debug-linq-in-csharp/
Bleeping Computer
- https://www.bleepingcomputer.com/
Utilities
---------
Handle v5.0
- https://learn.microsoft.com/en-us/sysinternals/downloads/handle?WT.mc_id=DT-MVP-5003978
Auto Increment Build #
- https://stackoverflow.com/questions/826777/how-to-have-an-auto-incrementing-version-number-visual-studio
Philosophy
----------
1. Do I have to have a "purpose" to have an address in the USA?
- If yes, then as a human being I must have a purpose? Seriously? Ok, a purpose to whom? To whom must I state my purpose, or execute it, or report about it...???
2. System Failure - Zero Day Exploit
3. Good PR example - https://github.com/dotnet/aspnetcore/pull/45587/files
App Insights Log Queries
------------------------
availabilityResults
| where timestamp > datetime("2022-12-19T04:07:00.000Z") and timestamp < datetime("2022-12-20T04:07:00.000Z")
| where customDimensions["WebtestArmResourceName"] == "availability-test-1-app-notepad-physical-activity-dev-eastus"
| where true and true
| extend percentage = toint(success) * 100
| summarize avg(percentage) by bin(timestamp, 1h)
| render timechart
******************************************************************
@DocGreenRob

To achieve a single endpoint that routes traffic to your app service instances deployed in different regions, you can use Azure Traffic Manager or Azure Front Door. These services provide global load balancing and routing capabilities, allowing you to direct incoming requests to the nearest or most available instance of your app service based on factors such as geographic location, latency, or endpoint health.

Here's how you can set it up:

  1. Deploy your app service instances: Deploy your app service to multiple regions as you normally would.

  2. Configure Azure Traffic Manager or Azure Front Door:

    • Create a new instance of either Azure Traffic Manager or Azure Front Door in your Azure portal.
    • Define the routing method you prefer, such as geographic routing or performance-based routing.
    • Add the endpoints of your app service instances deployed in different regions to the routing configuration.
    • Configure the desired routing behavior, such as priority-based or weighted routing, depending on your requirements.
  3. Update DNS settings: Point your domain's DNS records to the endpoint provided by Azure Traffic Manager or Azure Front Door. This ensures that incoming requests to your domain are routed through the load balancer or traffic manager, which then forwards them to the appropriate app service instance based on the routing rules you've defined.

By using Azure Traffic Manager or Azure Front Door, you can maintain a single endpoint for your application while distributing traffic across multiple app service instances deployed in different regions for improved availability and performance.
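
For reference, here is a minimal Azure CLI sketch of the Traffic Manager route. All names and the target resource ID are placeholders, not from this gist:

# Create a Traffic Manager profile with performance-based routing
az network traffic-manager profile create `
  --name my-tm-profile `
  --resource-group my-rg `
  --routing-method Performance `
  --unique-dns-name my-app-global

# Add one regional App Service instance as an endpoint (repeat per region)
az network traffic-manager endpoint create `
  --name eastus-endpoint `
  --profile-name my-tm-profile `
  --resource-group my-rg `
  --type azureEndpoints `
  --target-resource-id /subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Web/sites/my-app-eastus

Point your domain's CNAME at the profile's trafficmanager.net address to complete step 3.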


DocGreenRob commented Mar 3, 2024

SQL: Find Indexes in DB

To find all indexes for all tables in any database in SQL Server, you can use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName,
    ind.name AS IndexName,
    col.name AS ColumnName
FROM 
    sys.tables t
INNER JOIN 
    sys.indexes ind ON t.object_id = ind.object_id
INNER JOIN 
    sys.index_columns ic ON ind.object_id = ic.object_id AND ind.index_id = ic.index_id
INNER JOIN 
    sys.columns col ON t.object_id = col.object_id AND col.column_id = ic.column_id
WHERE 
    ind.index_id > 0 -- Ignore heap tables
    AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
ORDER BY 
    SchemaName,
    TableName,
    IndexName,
    ic.index_column_id;

Replace YourDatabaseName with the name of your database. You will need to run this script for each database you have.

For the second query, to find tables without indexes in any database, use the following query:

USE [YourDatabaseName]; -- Replace with your actual database name
GO

SELECT 
    DB_NAME() AS DatabaseName,
    SCHEMA_NAME(t.schema_id) AS SchemaName,
    t.name AS TableName
FROM 
    sys.tables t
WHERE 
    NOT EXISTS (
        SELECT 
            1 
        FROM 
            sys.indexes ind 
        WHERE 
            t.object_id = ind.object_id 
            AND ind.index_id > 0 -- Indexes with index_id = 0 are heaps
            AND ind.is_hypothetical = 0 -- Ignore hypothetical indexes
    )
    AND t.type = 'U' -- Only include user tables
ORDER BY 
    SchemaName,
    TableName;

Again, replace YourDatabaseName with the name of your database. This script also needs to be run for each database.

If you are looking to run these queries across all databases on a server, you would need to create a dynamic SQL script that iterates through each database and executes the query. However, this approach is more complex and should be handled with care, as it may have implications for performance and security.
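
If you do want a quick way to run the second query everywhere, one hedged option is the undocumented sp_MSforeachdb procedure. This is a sketch only; the procedure is unsupported by Microsoft, so use it with care:

-- sp_MSforeachdb substitutes each database name for the ? placeholder
EXEC sp_MSforeachdb N'
IF ''?'' NOT IN (''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN
    SELECT ''?'' AS DatabaseName, s.name AS SchemaName, t.name AS TableName
    FROM [?].sys.tables t
    INNER JOIN [?].sys.schemas s ON t.schema_id = s.schema_id
    WHERE t.type = ''U''
      AND NOT EXISTS (SELECT 1 FROM [?].sys.indexes i
                      WHERE i.object_id = t.object_id
                        AND i.index_id > 0
                        AND i.is_hypothetical = 0);
END';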


DocGreenRob commented Mar 3, 2024

Manually Rebuilding Indexes

For SQL Server, you can rebuild all indexes on a table with the following command:

ALTER INDEX ALL ON YourTableName REBUILD;

If you have many tables and want to rebuild all indexes on all tables, you can use a cursor to loop through the tables:

DECLARE @TableName NVARCHAR(517) -- room for a quoted two-part name (schema.table)

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME([name]) -- schema-qualified so non-dbo tables work
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC('ALTER INDEX ALL ON ' + @TableName + ' REBUILD WITH (ONLINE = ON)')
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Remember to replace YourTableName with the actual table name. The WITH (ONLINE = ON) option allows the index to be rebuilt without taking the underlying table offline. This is only available in SQL Server Enterprise edition.

Automating Index Rebuilds with a Job

In SQL Server, you can schedule a job using SQL Server Agent:

  1. Open SQL Server Management Studio (SSMS).
  2. Connect to your SQL Server instance.
  3. Expand the "SQL Server Agent" node.
  4. Right-click on "Jobs" and choose "New Job...".
  5. Name your job and provide a description.
  6. Go to the "Steps" page and create a new step.
  7. Name the step and set the type to "Transact-SQL script (T-SQL)".
  8. Paste your index rebuild script into the command window.
  9. Go to the "Schedules" page and create a new schedule.
  10. Set the schedule to run daily at a time when the database load is low, like during the night.

Here is an example T-SQL script for the job step that rebuilds all indexes on all user tables:

DECLARE @TableName NVARCHAR(517) -- room for a quoted two-part name (schema.table)
DECLARE @Sql NVARCHAR(1000)

DECLARE TableCursor CURSOR FOR 
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME([name]) -- schema-qualified so non-dbo tables work
FROM sys.tables

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'ALTER INDEX ALL ON ' + @TableName + N' REBUILD WITH (ONLINE = ON)'
    EXEC sp_executesql @Sql
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor

Make sure you have proper backups and understand that index rebuilding can be resource-intensive. Monitor the job execution to ensure it doesn't adversely affect production workloads.

Also, consider whether you need to rebuild all indexes; sometimes, it's better to analyze index usage and fragmentation levels and only rebuild or reorganize where necessary. There are scripts available to help with this, such as Ola Hallengren's SQL Server Maintenance Solution.
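
As a rough illustration of that selective approach, here is a minimal T-SQL sketch; the 5% and 30% thresholds follow common guidance and should be tuned for your workload:

-- Suggest REORGANIZE for light fragmentation and REBUILD for heavy fragmentation
SELECT
    QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name) AS TableName,
    i.name AS IndexName,
    ps.avg_fragmentation_in_percent,
    CASE
        WHEN ps.avg_fragmentation_in_percent > 30 THEN 'REBUILD'
        WHEN ps.avg_fragmentation_in_percent > 5  THEN 'REORGANIZE'
        ELSE 'LEAVE ALONE'
    END AS SuggestedAction
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
INNER JOIN sys.indexes i ON ps.object_id = i.object_id AND ps.index_id = i.index_id
INNER JOIN sys.tables t ON t.object_id = ps.object_id
WHERE ps.index_id > 0 -- skip heaps
ORDER BY ps.avg_fragmentation_in_percent DESC;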


DocGreenRob commented Mar 16, 2024

Storing data in crystals

Storing data in crystals, specifically using 5D optical data storage in nanostructured glass, is a fascinating and cutting-edge technology. However, it's important to note that this technology is still in the experimental stage and not yet available for everyday use, like storing a "Hello World" program in a crystal.

The process involves using ultrafast lasers to create microscopic structures in fused quartz. These structures, representing the stored data, can be manipulated in five dimensions: three spatial dimensions plus two additional configurations related to the orientation and size of the nanostructures. The University of Southampton has been at the forefront of this research. Their technique allows for incredibly dense and long-lasting data storage – up to 360 terabytes of data can potentially be stored in a disc the size of a conventional CD, with a lifespan of up to 13.8 billion years.

Reading the data requires specific equipment like an optical microscope and a polarizer due to the microscopic and complex nature of the data encoding.

Meanwhile, Project Silica by Microsoft is exploring similar technology, focusing on storing data in quartz glass for long-term archival purposes. They aim to create a storage technology that's both sustainable and secure, using a unique approach that involves writing data with femtosecond lasers and reading it with polarization-sensitive microscopy.

It's an exciting field, but for now, these technologies are mainly in the research phase and not yet available for personal computing or general programming applications. Keep an eye on the progress in this area, as it holds great potential for future data storage solutions!

@DocGreenRob

Certainly! Here are 10 .NET and software engineering-themed jokes for you and your coworkers:

  1. Why do C# and VB.NET developers stay away from the beach? Because they can't handle unmanaged sandboxes.

  2. Why was the JavaScript developer sad? Because he didn't Node how to Express himself!

  3. Why do programmers always mix up Halloween and Christmas? Because Oct 31 equals Dec 25.

  4. Why did the .NET developer feel boxed in? Because he was surrounded by too many frameworks!

  5. How many software engineers does it take to change a light bulb? None, that's a hardware problem.

  6. Why did the software engineer go to art school? To learn how to handle exceptions beautifully.

  7. What do you call a .NET developer who writes clean code? A C#arp individual.

  8. Why don't .NET developers play hide and seek? Because good luck hiding when the garbage collector is running.

  9. How do you praise a .NET developer? "You're so good, you must have a really strong Framework of knowledge!"

  10. Why was the ASP.NET website so alert? Because it was always awaiting requests.

Enjoy sharing these with your team! 😄

@DocGreenRob

You can use the git branch command in PowerShell to list all branches in your repository. To exclude specific branches like 'master' and 'branch one', you can use the grep -v command which inverts the match. Here is the command you can use:

git branch | grep -v "master\|branch one"

This command will list all the branches in your current repository except for 'master' and 'branch one'. Please replace 'master' and 'branch one' with the actual names of the branches you want to exclude if they are different.

Please note that this command requires Git Bash utilities. If you're using pure PowerShell, you might need to use a different approach as grep is not a built-in command in PowerShell. Here's how you can do it in pure PowerShell:

git branch | Where-Object {$_ -notmatch 'master|branch one'}

This command does the same thing: it lists all branches except 'master' and 'branch one'. The Where-Object cmdlet in PowerShell is used to filter out objects from a collection that do not meet a certain condition. In this case, the condition is that the branch name does not match 'master' or 'branch one'.

@DocGreenRob

You can delete all branches in your repository except for 'master', 'branch one', and the currently checked out branch using the following command in PowerShell:

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -d $_.trim() } }

This command does the following:

  • git branch lists all branches.
  • ForEach-Object loops over each branch.
  • if ($_ -notmatch '\*|master|branch one') checks if the branch is not the currently checked out branch (represented by *) and is not 'master' or 'branch one'.
  • git branch -d $_.trim() deletes the branch if the condition is met.

Please replace 'master' and 'branch one' with the actual names of the branches you want to keep if they are different.

Note: This command will only delete branches that have been merged. If you want to delete all branches regardless of whether they have been merged or not, you can use the -D option instead of -d.

git branch | ForEach-Object { if ($_ -notmatch '\*|master|branch one') { git branch -D $_.trim() } }

Be careful with this command as it will delete branches even if they contain unmerged changes. Always make sure to check the branches and their changes before deleting them. It's a good practice to keep your work committed and pushed to a remote repository to avoid any loss of work.

@DocGreenRob

# Toggles Scroll Lock every 60 seconds to keep the machine from going idle
$WShell = New-Object -Com Wscript.Shell
while (1) {$WShell.SendKeys("{SCROLLLOCK}"); sleep 60}

@DocGreenRob

Great question — this gets to the heart of modern software architecture vs. legacy (but sometimes still valid) approaches.

Let’s break it down clearly:


⚖️ Stored Procedure vs. Business Logic in Application Code

| Aspect | Stored Procedures | Application-Level Logic (e.g., LINQ, EF, Services) |
| --- | --- | --- |
| Performance | ✅ Very fast for set-based operations<br>✅ Precompiled and optimized by SQL Server | ⚠️ May be slower for large data sets unless optimized<br>❌ Not precompiled, relies on EF query generation |
| Maintainability | ❌ Harder to version & maintain<br>❌ Poor readability, especially for complex logic<br>❌ Requires DBA or SQL expertise | ✅ Easier to version in Git<br>✅ More readable & testable<br>✅ Fits with modern CI/CD pipelines |
| Testability | ❌ Difficult to write unit tests<br>✅ Can write integration tests, but hard to mock | ✅ Fully testable with mocks, stubs, unit tests<br>✅ Better coverage & automation |
| Deployment | ❌ Requires direct DB script deployment<br>❌ Harder to rollback or track versions | ✅ Deploys with the app code<br>✅ Works with CI/CD pipelines, git-controlled |
| Reusability | ❌ Not easily shared across services unless all touch the same DB | ✅ Code reuse through services, interfaces, shared libraries |
| Security | ✅ Can restrict access to just stored procs (minimize SQL injection) | ⚠️ Needs more care (e.g., parameterized queries, validation)<br>✅ EF Core helps with SQL injection |
| Portability | ❌ Tied to a specific RDBMS (e.g., SQL Server T-SQL) | ✅ Works across SQL Server, PostgreSQL, MySQL, etc. if ORM is used |
| Version Control | ❌ Often forgotten or manually managed | ✅ Git handles code changes, history, rollbacks |
| Business Logic Location | ❌ Logic is in the DB; makes code harder to understand holistically | ✅ Logic is centralized in app code; clearer architecture |
| Legacy Support | ✅ Works well in older architectures<br>✅ Can encapsulate legacy business logic | ✅ Preferred for greenfield and modern microservices architectures |

💡 Summary

  • Stored Procedures are great for:

    • High-performance batch jobs
    • Legacy systems
    • Tight DBA control environments
    • Data integrity enforcement close to the data
  • App-Level Logic (like what you're doing) is better for:

    • Testability (unit tests, mocking, automation)
    • Maintainability (Git, version control)
    • Scalability (microservices, clean architecture)
    • Developer productivity (modern tooling, CI/CD)

🧠 Final Advice

You're doing the modern, clean approach by keeping business logic out of the DB and inside service layers and EF/LINQ logic. This gives you better test coverage, source control, and clarity across your team.

If you need performance on large data ops, you can combine both: use EF for most of the system, and have a few well-tuned stored procedures for data-heavy reports or batch updates.
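
Here is a minimal sketch of that hybrid: EF Core for normal work, one hand-tuned stored procedure for a heavy report. All names here (ReportDbContext, MonthlySalesRow, dbo.usp_GetMonthlySales) are illustrative assumptions, not from this thread:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Keyless result shape for the report query
public class MonthlySalesRow
{
    public int Year { get; set; }
    public int Month { get; set; }
    public decimal Total { get; set; }
}

public class ReportDbContext : DbContext
{
    public ReportDbContext(DbContextOptions<ReportDbContext> options) : base(options) { }

    protected override void OnModelCreating(ModelBuilder modelBuilder) =>
        modelBuilder.Entity<MonthlySalesRow>().HasNoKey().ToView(null); // query-only type, no table
}

public class ReportService
{
    private readonly ReportDbContext _db;

    public ReportService(ReportDbContext db) => _db = db;

    // Call the tuned stored procedure; parameters are passed safely via interpolation
    public Task<List<MonthlySalesRow>> GetMonthlySalesAsync(int year) =>
        _db.Set<MonthlySalesRow>()
           .FromSqlInterpolated($"EXEC dbo.usp_GetMonthlySales @Year = {year}")
           .AsNoTracking()
           .ToListAsync();
}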

Let me know if you want to see a hybrid example or best practices for stored procs in a modern stack.

@DocGreenRob

Perfect. Since you've already wired up the messaging system and are preparing customers for an upcoming outage (even if the date isn’t final), here are several daily rotating messages you can use. These are written to sound professional, calm, and transparent — enough to build trust without panic:


DAY 1 — Awareness Message (Use Today)

Heads up!
We're making some important upgrades behind the scenes that may cause temporary downtime in the coming days. No action needed now — just keeping you in the loop. More updates soon!


DAY 2 — Soft Warning

Coming Soon:
A short service outage may be required as we improve our systems. We're working to minimize the impact and will notify you before any changes take place.


DAY 3 — Early Countdown Starts (5 Days Out)

Service Notice: 5 Days Out
Our platform will undergo scheduled maintenance in the next few days. You may experience brief interruptions. We appreciate your patience as we prepare for a smoother, faster experience.


DAY 4 — 4 Days Out

Service Update: 4 Days Away
We’re finalizing infrastructure changes that may cause a short outage soon. We’ll keep you informed and support you through every step.


DAY 5 — 3 Days Out

Planned Maintenance in 3 Days
In preparation for system upgrades, a brief outage may occur soon. You’ll be notified again 24 hours before it happens.


DAY 6 — 2 Days Out

Just 2 Days Left
We’re getting ready for a scheduled outage to upgrade our system. Expect temporary unavailability. Thank you for being part of this journey.


DAY 7 — 1 Day Out

24-Hour Notice
Our system will go offline temporarily starting tomorrow as we migrate to a new platform. We’ll notify you once we’re back online. Thanks for your patience and support!


During Outage Message

We’re Currently Under Maintenance
Our platform is temporarily offline as we complete essential system upgrades. We’ll notify you the moment we’re back. We appreciate your understanding!


Post-Outage Message

We’re Back Online!
Thank you for your patience during our upgrade. Our system is now live and ready to serve you even better.


Let me know if you want a JSON version of these messages for your API or a quick API toggle script for environment configs.

@DocGreenRob

editorconfig

NOTE: Requires VS2019 16.3 or later

# CPS Editor Config Version 1.0

# Microsoft Managed Recommended Rules
# Description: These rules focus on the most critical problems in your code, including potential
# security holes, application crashes, and other important logic and design errors. It is
# recommended to include this rule set in any custom rule set you create for your projects.

# Code files
[*.{cs,vb}]

dotnet_diagnostic.CA1001.severity = warning
dotnet_diagnostic.CA1009.severity = warning
dotnet_diagnostic.CA1016.severity = warning
dotnet_diagnostic.CA1033.severity = warning
dotnet_diagnostic.CA1049.severity = warning
dotnet_diagnostic.CA1060.severity = warning
dotnet_diagnostic.CA1061.severity = warning
dotnet_diagnostic.CA1063.severity = warning
dotnet_diagnostic.CA1065.severity = warning
dotnet_diagnostic.CA1301.severity = warning
dotnet_diagnostic.CA1303.severity = none
dotnet_diagnostic.CA1400.severity = warning
dotnet_diagnostic.CA1401.severity = warning
dotnet_diagnostic.CA1403.severity = warning
dotnet_diagnostic.CA1404.severity = warning
dotnet_diagnostic.CA1405.severity = warning
dotnet_diagnostic.CA1410.severity = warning
dotnet_diagnostic.CA1415.severity = warning
dotnet_diagnostic.CA1821.severity = warning
dotnet_diagnostic.CA1900.severity = warning
dotnet_diagnostic.CA1901.severity = warning
dotnet_diagnostic.CA2002.severity = warning
dotnet_diagnostic.CA2100.severity = warning
dotnet_diagnostic.CA2101.severity = warning
dotnet_diagnostic.CA2108.severity = warning
dotnet_diagnostic.CA2111.severity = warning
dotnet_diagnostic.CA2112.severity = warning
dotnet_diagnostic.CA2114.severity = warning
dotnet_diagnostic.CA2116.severity = warning
dotnet_diagnostic.CA2117.severity = warning
dotnet_diagnostic.CA2122.severity = warning
dotnet_diagnostic.CA2123.severity = warning
dotnet_diagnostic.CA2124.severity = warning
dotnet_diagnostic.CA2126.severity = warning
dotnet_diagnostic.CA2131.severity = warning
dotnet_diagnostic.CA2132.severity = warning
dotnet_diagnostic.CA2133.severity = warning
dotnet_diagnostic.CA2134.severity = warning
dotnet_diagnostic.CA2137.severity = warning
dotnet_diagnostic.CA2138.severity = warning
dotnet_diagnostic.CA2140.severity = warning
dotnet_diagnostic.CA2141.severity = warning
dotnet_diagnostic.CA2146.severity = warning
dotnet_diagnostic.CA2147.severity = warning
dotnet_diagnostic.CA2149.severity = warning
dotnet_diagnostic.CA2200.severity = warning
dotnet_diagnostic.CA2202.severity = warning
dotnet_diagnostic.CA2207.severity = warning
dotnet_diagnostic.CA2212.severity = warning
dotnet_diagnostic.CA2213.severity = warning
dotnet_diagnostic.CA2214.severity = warning
dotnet_diagnostic.CA2216.severity = warning
dotnet_diagnostic.CA2220.severity = warning
dotnet_diagnostic.CA2229.severity = warning
dotnet_diagnostic.CA2231.severity = warning
dotnet_diagnostic.CA2232.severity = warning
dotnet_diagnostic.CA2235.severity = warning
dotnet_diagnostic.CA2236.severity = warning
dotnet_diagnostic.CA2237.severity = warning
dotnet_diagnostic.CA2238.severity = warning
dotnet_diagnostic.CA2240.severity = warning
dotnet_diagnostic.CA2241.severity = warning
dotnet_diagnostic.CA2242.severity = warning
dotnet_diagnostic.SA1101.severity = none

# Fix formatting

# Access modifier required
dotnet_diagnostic.IDE0040.severity = warning
# Name can be simplified (e.g., bool)
dotnet_diagnostic.IDE0049.severity = warning
# Remove unnecessary import
dotnet_diagnostic.IDE0005.severity = warning

# Fix formatting
dotnet_diagnostic.IDE0055.severity = warning

# Naming rule violation (e.g., element should start with an uppercase letter)
dotnet_diagnostic.IDE1006.severity = warning
# Element should be documented (missing XML comment)
dotnet_diagnostic.CS1591.severity = warning

# IDE0065: Misplaced using directive
dotnet_diagnostic.IDE0065.severity = warning

# IDE2000: Multiple-space issue
dotnet_diagnostic.IDE2000.severity = error

[*.cs]

# SA1518: Use line endings correctly at end of file - not working
dotnet_diagnostic.SA1518.severity = warning

# SA1511: While-do footer should not be preceded by blank line - not working
dotnet_diagnostic.SA1511.severity = warning

# SA1519: Braces should not be omitted from multi-line child statement - not working
dotnet_diagnostic.SA1519.severity = warning

# SA1135: not working in this property but working in others
dotnet_diagnostic.SA1135.severity = none

dotnet_diagnostic.SA1413.severity = none

[*.{cs,vb}]

# File header
file_header_template = -----------------------------------------------------------------------\n \nCopyright (c) Comprehensive Pharmacy Services. All rights reserved.\n\n-------------------------------------------------------------------
dotnet_diagnostic.IDE0073.severity = warning

# Naming styles

# Locals and parameters are camelCase
dotnet_naming_rule.locals_should_be_camel_case.severity = warning
dotnet_naming_rule.locals_should_be_camel_case.symbols = locals_and_parameters
dotnet_naming_rule.locals_should_be_camel_case.style = camel_case_style
dotnet_naming_symbols.locals_and_parameters.applicable_kinds = parameter, local
dotnet_naming_style.camel_case_style.capitalization = camel_case

# Naming rules

dotnet_naming_rule.interface_should_be_begins_with_i.severity = suggestion
dotnet_naming_rule.interface_should_be_begins_with_i.symbols = interface
dotnet_naming_rule.interface_should_be_begins_with_i.style = begins_with_i

dotnet_naming_rule.types_should_be_pascal_case.severity = warning
dotnet_naming_rule.types_should_be_pascal_case.symbols = types
dotnet_naming_rule.types_should_be_pascal_case.style = pascal_case

dotnet_naming_rule.non_field_members_should_be_pascal_case.severity = warning
dotnet_naming_rule.non_field_members_should_be_pascal_case.symbols = non_field_members
dotnet_naming_rule.non_field_members_should_be_pascal_case.style = pascal_case

# Symbol specifications

dotnet_naming_symbols.interface.applicable_kinds = interface
dotnet_naming_symbols.interface.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.interface.required_modifiers =

dotnet_naming_symbols.types.applicable_kinds = class, struct, interface, enum
dotnet_naming_symbols.types.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.types.required_modifiers =

dotnet_naming_symbols.non_field_members.applicable_kinds = property, event, method
dotnet_naming_symbols.non_field_members.applicable_accessibilities = public, internal, private, protected, protected_internal, private_protected
dotnet_naming_symbols.non_field_members.required_modifiers =

# Naming styles

dotnet_naming_style.begins_with_i.required_prefix = I
dotnet_naming_style.begins_with_i.required_suffix =
dotnet_naming_style.begins_with_i.word_separator =
dotnet_naming_style.begins_with_i.capitalization = pascal_case

dotnet_naming_style.pascal_case.required_prefix =
dotnet_naming_style.pascal_case.required_suffix =
dotnet_naming_style.pascal_case.word_separator =
dotnet_naming_style.pascal_case.capitalization = pascal_case
dotnet_style_operator_placement_when_wrapping = beginning_of_line
tab_width = 4
indent_size = 4
end_of_line = crlf
dotnet_style_coalesce_expression = true:suggestion
dotnet_style_null_propagation = true:suggestion
dotnet_style_prefer_is_null_check_over_reference_equality_method = true:suggestion
dotnet_style_prefer_auto_properties = true:silent
dotnet_style_allow_multiple_blank_lines_experimental = false:error
dotnet_style_predefined_type_for_locals_parameters_members = true:silent
dotnet_style_prefer_inferred_tuple_names = true:suggestion
dotnet_style_prefer_inferred_anonymous_type_member_names = true:suggestion
dotnet_style_namespace_match_folder = true:warning
dotnet_style_object_initializer = true:suggestion
dotnet_style_collection_initializer = true:suggestion
dotnet_style_prefer_simplified_boolean_expressions = true:suggestion
dotnet_style_prefer_conditional_expression_over_assignment = true:silent
dotnet_style_prefer_conditional_expression_over_return = true:silent
dotnet_style_explicit_tuple_names = true:suggestion
dotnet_style_allow_statement_immediately_after_block_experimental = true:silent
dotnet_style_prefer_compound_assignment = true:suggestion
dotnet_style_prefer_simplified_interpolation = true:suggestion
dotnet_style_readonly_field = true:suggestion
dotnet_style_predefined_type_for_member_access = true:silent
dotnet_style_require_accessibility_modifiers = for_non_interface_members:silent
dotnet_code_quality_unused_parameters = all:suggestion

[*.cs]
csharp_prefer_braces = true:warning
csharp_indent_labels = one_less_than_current
csharp_using_directive_placement = inside_namespace:warning
csharp_prefer_simple_using_statement = true:warning
csharp_style_namespace_declarations = block_scoped:silent
csharp_style_prefer_method_group_conversion = true:silent
csharp_style_allow_blank_lines_between_consecutive_braces_experimental = true:silent
csharp_new_line_before_else = true
csharp_style_unused_value_expression_statement_preference = discard_variable:silent
csharp_prefer_static_local_function = true:suggestion
csharp_style_allow_embedded_statements_on_same_line_experimental = true:silent
csharp_style_prefer_top_level_statements = true:silent
csharp_style_expression_bodied_methods = false:silent
csharp_style_expression_bodied_constructors = false:silent
csharp_style_expression_bodied_operators = false:silent
csharp_style_expression_bodied_properties = true:silent
csharp_style_expression_bodied_indexers = true:silent
csharp_style_expression_bodied_accessors = true:silent
csharp_style_expression_bodied_lambdas = true:silent
csharp_style_expression_bodied_local_functions = false:silent
csharp_space_around_binary_operators = before_and_after
csharp_style_prefer_local_over_anonymous_function = true:suggestion
csharp_style_conditional_delegate_call = true:suggestion
csharp_style_pattern_matching_over_is_with_cast_check = true:suggestion
csharp_style_throw_expression = true:suggestion
csharp_style_prefer_null_check_over_type_check = true:suggestion
csharp_prefer_simple_default_expression = true:suggestion
csharp_style_prefer_index_operator = true:suggestion
csharp_style_prefer_range_operator = true:suggestion
csharp_style_implicit_object_creation_when_type_is_apparent = true:suggestion
csharp_style_prefer_tuple_swap = true:suggestion
csharp_style_prefer_utf8_string_literals = true:suggestion
csharp_style_inlined_variable_declaration = true:suggestion
csharp_style_deconstructed_variable_declaration = true:suggestion
csharp_style_unused_value_assignment_preference = discard_variable:suggestion
csharp_style_allow_blank_line_after_colon_in_constructor_initializer_experimental = true:silent

@DocGreenRob

What you're seeing is likely a combination of Roslyn analyzers + editor config + stylecop settings, and possibly code analysis rules configured to treat warnings as errors in that project.

Here’s how to trace and port that setup:


✅ 1. Look for .editorconfig in the project root

This file governs most style and formatting rules now (replacing ruleset files in many modern .NET projects).

Open it and look for things like:

# Enforce XML documentation
dotnet_diagnostic.CS1591.severity = error

# Enforce braces for if/else
csharp_prefer_braces = true:error

# No extra blank lines
dotnet_diagnostic.SA1507.severity = error

👉 Copy this file into the root of your personal project(s) to carry over the rules.


✅ 2. Check for Roslyn Analyzers in .csproj

In your .csproj, check if there’s a reference to something like:

<PackageReference Include="Microsoft.CodeAnalysis.NetAnalyzers" Version="x.x.x" />
<PackageReference Include="StyleCop.Analyzers" Version="x.x.x" />

These are NuGet packages enforcing coding rules. If present, copy them to your personal .csproj.

You might also see:

<AnalysisLevel>latest</AnalysisLevel>
<WarningsAsErrors>CS1591;SAxxxx;IDExxxx</WarningsAsErrors>

This line converts specific warnings into compile-breaking errors. Copy this too.


✅ 3. Check for ruleset files (older projects)

Some older projects still use .ruleset files:

  • Open the .csproj or right-click the project → Properties → Code Analysis
  • Look for a .ruleset file path.

If you see it, copy that file and the reference in the .csproj.


✅ 4. Copy recommended StyleCop settings (if missing)

If you’re missing StyleCop, you can install and configure it like this:

dotnet add package StyleCop.Analyzers

Then add or adjust .editorconfig like:

# StyleCop rules
dotnet_diagnostic.SA0001.severity = error
dotnet_diagnostic.SA1101.severity = error
dotnet_diagnostic.SA1500.severity = error
dotnet_diagnostic.SA1600.severity = error  # XML doc required

✅ 5. Optional: Use Directory.Build.props to share across projects

If you want all your projects to inherit these rules, create a Directory.Build.props file at a higher folder level with shared package references and settings.
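
As a rough illustration, a minimal Directory.Build.props might look like this; the package versions are placeholders, so pin whatever your team actually uses:

<!-- Hedged sketch: shared analyzer setup inherited by every project below this folder -->
<Project>
  <PropertyGroup>
    <AnalysisLevel>latest</AnalysisLevel>
    <!-- Promote missing-XML-doc warnings to errors, as in the customer setup described above -->
    <WarningsAsErrors>CS1591</WarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <!-- Versions are illustrative placeholders -->
    <PackageReference Include="StyleCop.Analyzers" Version="1.2.0-beta.556" PrivateAssets="all" />
    <PackageReference Include="Microsoft.CodeAnalysis.NetAnalyzers" Version="9.0.0" PrivateAssets="all" />
  </ItemGroup>
</Project>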


✅ Summary: What to Copy

  1. .editorconfig
  2. .ruleset (if applicable)
  3. NuGet packages for StyleCop.Analyzers or Microsoft.CodeAnalysis.*
  4. <WarningsAsErrors> or <AnalysisLevel> in .csproj
  5. Any Directory.Build.props overrides

Let me know if you want me to generate a clean baseline .editorconfig and .csproj with strict rules for your own projects. I can tailor it exactly like the one your customer uses.


DocGreenRob commented Jul 26, 2025

[HttpPatch("api/blandai/callflow/{id}")]
[ProducesResponseType(typeof(Task<IActionResult>), StatusCodes.Status200OK)]
[ProducesResponseType(typeof(NotFoundResult), StatusCodes.Status404NotFound)]
[ProducesResponseType(typeof(UnauthorizedResult), StatusCodes.Status401Unauthorized)]
public async Task<IActionResult> UpdateMaintenanceTimeAsync(string id, JsonPatchDocument<PatchCallFlowDto> jsonPatchDocument)
{
	// Before processing the Patch, validate by FluentValidation
	//var userDto = await _userService.GetByIdAsync(id).ConfigureAwait(false);
	//var roleEntity = _mapper.Map<User>(userDto);
	//jsonPatchDocument.Map<PatchUser, User>().ApplyTo(roleEntity);
	//userDto = _mapper.Map<UserDto>(roleEntity);

	//// TODO: Ensure other controllers' PatchAsync validates the Dto prior to sending to Service
	//var validationResult = new UserValidator().Validate(userDto);

	//if (!validationResult.IsValid)
	//{
	//	return BadRequest(validationResult.Errors.Select(x => x.ErrorMessage));
	//}

	var result = await _blandAiManager.PatchCallFlowAsync(id, jsonPatchDocument);

	var recordPatchedMessage = new RecordPatchedMessageDto<CallFlowDto>
	{
		OldRecord = result.Item1,
		NewRecord = result.Item2
	};

	var userProperties = new Dictionary<string, string>
	{
		{ "CallFlowDto", id }
	};

	TelemetryClient.TrackEvent($"RecordPatch", userProperties);

	return Ok(recordPatchedMessage);
}
public class PatchCallFlowDto
{
	public bool? IsNextStepOk { get; set; }
	public bool? IsNextStepComplete { get; set; }
}
public async Task<Tuple<CallFlowDto, CallFlowDto>> PatchCallFlowAsync(string id, JsonPatchDocument<PatchCallFlowDto> jsonPatchDocument)
{
	if (!long.TryParse(id, out long callFlowId))
	{
		throw new ArgumentException("Invalid call flow id");
	}

	var blandAiCallFlowEntities = await _unitOfWork.BlandAiCallFlowRepository.GetAsync(x => x.Id == callFlowId);

	if (!blandAiCallFlowEntities.Any() || blandAiCallFlowEntities.Count() > 1)
	{
		throw new ArgumentOutOfRangeException(nameof(id));
	}

	var blandAiCallFlowEntity = blandAiCallFlowEntities.First();

	var originalTimeLogDto = _mapper.Map<CallFlowDto>(blandAiCallFlowEntity);

	jsonPatchDocument.Map<PatchCallFlowDto, BlandAiCallFlow>().ApplyTo(blandAiCallFlowEntity);

	blandAiCallFlowEntity.UpdatedByUser = Constants.UserEmail;
	blandAiCallFlowEntity.UpdatedDate = DateTime.UtcNow;

	Expression<Func<Project, bool>> expression = x => x.Id == blandAiCallFlowEntity.Id;

	await _unitOfWork.BlandAiCallFlowRepository.UpdateAsync(blandAiCallFlowEntity).ConfigureAwait(false);
	await _unitOfWork.BlandAiCallFlowRepository.SaveAsync().ConfigureAwait(false);

	return new Tuple<CallFlowDto, CallFlowDto>(originalTimeLogDto, _mapper.Map<CallFlowDto>(blandAiCallFlowEntity));
}
CreateMap<BlandAiCallFlow, Common.Dto.External.BlandAi.CallFlowDto>().ReverseMap();
public class RecordPatchedMessageDto<T>
{
	public T OldRecord { get; set; }
	public T NewRecord { get; set; }
	public string TypeName { get; set; }
}
public static class JsonPathDocumentExtensions
{
	public const string InvalidPatchDocumentCode = "InvalidPatchDocument";
	public const string InvalidPatchDocumentProperty = "JsonPatchDocument";
	private static readonly ConcurrentDictionary<Type, ConstructorInfo> _defaultConstructors = new ConcurrentDictionary<Type, ConstructorInfo>();
	private static readonly ConcurrentDictionary<Type, IValidator> _validators = new ConcurrentDictionary<Type, IValidator>();


	public static JsonPatchDocument<TOut> Map<TIn, TOut>(this JsonPatchDocument<TIn> instance)
		where TIn : class, new()
		where TOut : class, new()
	{
		return new JsonPatchDocument<TOut>(instance.Operations.Select(x => x.Map<TIn, TOut>()).ToList(), instance.ContractResolver);
	}

	public static Operation<TOut> Map<TIn, TOut>(this Operation<TIn> instance)
		where TIn : class, new()
		where TOut : class, new()
	{
		return new Operation<TOut>(instance.op, instance.path, instance.from, instance.value);
	}
}
Sample JSON Patch request body:
[
    {
        "op": "replace",
        "path": "/OwnerBudget",
        "value": 350000
    }
]

@DocGreenRob

You're getting this error in Dev but not Local:

"The instance of entity type 'AssessmentItem' cannot be tracked because another instance with the same key value for {'AssessmentItemId'} is already being tracked."

This is an Entity Framework Core error that occurs when two separate instances of an entity with the same primary key are being tracked in the same DbContext. Let’s break it down:


✅ Why It Happens

EF Core enforces a one tracked instance per primary key rule. If you:

  • Load an entity (e.g., from a query), then
  • Later Attach or Insert another entity with the same primary key, EF will throw this error.

📍 Most Likely Cause in Your Code

You're doing:

var serializedCurrentAssessmentItemEntity = JsonConvert.SerializeObject(assessmentItemEntity);
var newAssessmentItemEntity = JsonConvert.DeserializeObject<AssessmentItem>(serializedCurrentAssessmentItemEntity);
...
newAssessmentItemEntity.AssessmentItemId = 0; // so it becomes a "new" insert

BUT LATER, this logic is violated here:

var asdf = await unitOfWork.AssessmentItemRepository.FindByAsync(...);

// If asdf is not null, you do:
await unitOfWork.AssessmentItemRepository.UpdateAsync(asdf);

If asdf is already being tracked AND you’re also tracking a new instance with the same AssessmentItemId, this will trigger the error.

And worse: your object graphs (sections, assessmentItems) might carry references to previously tracked entities from deserialization.


✅ Fix (3 Options)

Option 1: Use .AsNoTracking()

Anywhere you're querying for something that you don’t plan to update, make sure to use .AsNoTracking() to avoid the DbContext tracking that entity.

Update your repository to support it:

// In your repository FindByAsync or FindAllByAsync:
var query = dbContext.Set<T>().AsQueryable();
if (asNoTracking)
    query = query.AsNoTracking();

Use it like:

var currentAssessmentEntity = await unitOfWork.AssessmentRepository
    .FindByAsync(x => ..., cancellationToken, asNoTracking: true);

Option 2: Use Detach() before reusing the entity

Before reinserting deserialized copies or updating found ones, you can manually detach tracked entities like:

context.Entry(existingEntity).State = EntityState.Detached;

Or detach the whole context (not usually recommended).


Option 3: Use a clean DbContext scope

Instead of reusing a long-living DbContext that accumulates tracked state, do:

using (var scope = new TransactionScope(...))
{
    using (var newContext = new MyDbContext())
    {
        // work with clean context
    }
}

You can also do it with IServiceScopeFactory if using DI.
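
For completeness, a minimal sketch of the IServiceScopeFactory route; MyDbContext comes from the snippet above, while AssessmentCloner and CloneAsync are illustrative names:

using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

public class AssessmentCloner
{
    private readonly IServiceScopeFactory _scopeFactory;

    public AssessmentCloner(IServiceScopeFactory scopeFactory) => _scopeFactory = scopeFactory;

    public async Task CloneAsync(long assessmentId)
    {
        // Each scope resolves a fresh DbContext, so the change tracker starts empty
        using var scope = _scopeFactory.CreateScope();
        var db = scope.ServiceProvider.GetRequiredService<MyDbContext>();

        // ... load, clone (Id = 0), and attach entities using 'db' only ...

        await db.SaveChangesAsync();
    }
}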


🕵️ Why This Works in Local But Not in Dev?

Because your Dev likely uses different EF configuration, possibly:

  • SensitiveDataLogging is off (you should enable it in Dev temporarily)
  • Different lifetime for DbContext (Scoped vs Singleton)
  • Different records already in Dev that aren’t in local, triggering entity tracking conflict when resolving duplicates

✅ Suggested Next Steps

  1. Enable this for clearer logs in Dev:
     optionsBuilder.EnableSensitiveDataLogging();
  2. Update all your repo FindBy and FindAllBy to support AsNoTracking.
  3. During cloning, ensure you're only attaching new objects (with Id = 0) or objects that EF is not already tracking.


Want me to give you a diff of which repo queries to apply AsNoTracking() to based on the code above?

@DocGreenRob

Podcasts

Syntax 7/21/25

AI Tools
VS Code --> Windsurf --> Cursor

CLI Agents

  • Claude Code


DocGreenRob commented Aug 5, 2025

Setup...

1. Docker Setup

Download and install Docker Desktop - link

  • if forced to restart, launch Docker Desktop
  • if you get an error about Windows Subsystem for Linux, in Powershell (as Admin) run
    wsl --update

2. SonarQube Setup

docker run -d --name sonarqube -p 9000:9000 -e SONAR_ES_BOOTSTRAP_CHECKS_DISABLE=true sonarqube:latest

!! If you get a port error

docker: Error response from daemon: ports are not available: exposing port TCP 0.0.0.0:9000 -> 127.0.0.1:0: listen tcp 0.0.0.0:9000: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.

then you can either:

  1. Kill the port:
    a. Find out what's running on port 9000:
    netstat -aon | findstr :9000

    then you will see:

TCP    0.0.0.0:9000           0.0.0.0:0              LISTENING       13388
TCP    127.0.0.1:9000         0.0.0.0:0              LISTENING       13388
TCP    127.0.0.1:9000         127.0.0.1:50528        TIME_WAIT       0
TCP    127.0.0.1:9000         127.0.0.1:64865        ESTABLISHED     13388
TCP    127.0.0.1:50530        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50531        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50533        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50541        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50554        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50566        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:50579        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:63479        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:63484        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:63509        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:63510        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:63517        127.0.0.1:9000         TIME_WAIT       0
TCP    127.0.0.1:64865        127.0.0.1:9000         ESTABLISHED     14204
TCP    172.17.1.214:9000      20.37.135.94:50549     ESTABLISHED     13388
TCP    172.17.1.214:9000      20.59.87.225:64896     ESTABLISHED     13388
TCP    172.17.1.214:9000      20.59.87.226:57486     ESTABLISHED     13388
TCP    172.17.1.214:9000      20.169.174.231:63252   ESTABLISHED     13388
TCP    172.17.1.214:9000      23.33.29.208:63447     TIME_WAIT       0
TCP    172.17.1.214:9000      23.193.200.119:63254   ESTABLISHED     13388
TCP    172.17.1.214:9000      40.82.248.226:50571    ESTABLISHED     13388
TCP    172.17.1.214:9000      40.82.248.226:50575    ESTABLISHED     13388
TCP    172.17.1.214:9000      40.82.248.226:63425    TIME_WAIT       0
TCP    172.17.1.214:9000      40.126.27.66:63368     FIN_WAIT_2      13388
TCP    172.17.1.214:9000      52.96.16.162:63357     ESTABLISHED     13388
TCP    172.17.1.214:9000      52.96.119.194:61662    ESTABLISHED     13388
TCP    172.17.1.214:9000      52.109.8.89:50559      ESTABLISHED     13388
TCP    172.17.1.214:9000      52.109.16.87:62244     ESTABLISHED     13388
TCP    172.17.1.214:9000      52.152.143.207:63434   ESTABLISHED     13388
TCP    172.17.1.214:9000      52.152.143.207:63501   ESTABLISHED     13388
TCP    172.17.1.214:9000      52.179.73.37:63334     ESTABLISHED     13388
TCP    172.17.1.214:9000      54.149.170.228:64213   ESTABLISHED     13388
TCP    [::]:9000              [::]:0                 LISTENING       13388
UDP    0.0.0.0:9000           *:*                                    13388
UDP    [::]:9000              *:*                                    13388

The Process ID is the last column (i.e., 13388)

b. Then you can kill it by taskkill /PID 13388 /F

or...

  2. Use another port (which is what I'm doing):
    docker run -d --name sonarqube -p 9001:9000 -e SONAR_ES_BOOTSTRAP_CHECKS_DISABLE=true sonarqube:latest

! If you get this error:

docker: Error response from daemon: Conflict. The container name "/sonarqube" is already in use by container "656085e89c234748d11cbdd41aaa2ece8ad3aa44f809702660eb4db88f271602". You have to remove (or rename) that container to be able to reuse that name.

Now run docker rm -f sonarqube to remove it

Then run:
docker run -d --name sonarqube -p 9001:9000 -e SONAR_ES_BOOTSTRAP_CHECKS_DISABLE=true sonarqube:latest

This spins up SonarQube on http://localhost:9000 (or http://localhost:9001 if you remapped the host port as above).

3. Get your SonarQube Token

  1. Go to: http://localhost:9000/
  2. Default credentials:
    Username: admin
    Password: admin
  3. Click on "Account" icon in upper right
  4. Click "My Account"
  5. Select the "Security" tab at the top
  6. Proceed to "Generate Token" as per your needs. (Set the Token Type to "User" from the Dropdown list)

4. SonarScanner

  1. Download 👉 https://docs.sonarsource.com/sonarqube/latest/analyzing-source-code/scanners/sonarscanner/
  2. Extract & Add to PATH
  • Unzip the contents (e.g., to C:\SonarScanner)
  • Then add the bin folder to your PATH environment variable:
    • Press ⊞ Win → search "Environment Variables"
    • Click “Edit the system environment variables”
    • Click “Environment Variables”
    • Under System Variables → select Path → click Edit
    • Add: C:\SonarScanner\bin (or wherever you extracted)
  3. Test the Install
  • Open a new terminal (so it reloads the PATH) and run:
    sonar-scanner -v

5. Install support for .NET

Temporarily disable CPS Nuget feed to avoid the 401 error in the next step

  1. dotnet nuget list source
  2. You should see a response like:
Registered Sources:
  1.  nuget.org [Enabled]
      https://api.nuget.org/v3/index.json
  2.  CPS [Enabled]
      https://pkgs.dev.azure.com/ComprehensivePharmacyServices/_packaging/ComprehensivePharmacyServices%40Local/nuget/v3/index.json
  3.  Microsoft Visual Studio Offline Packages [Enabled]
      C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\
  3. dotnet nuget disable source "CPS"
  4. dotnet tool install --global dotnet-sonarscanner
  5. dotnet nuget enable source "CPS"

verify it works:

dotnet sonarscanner --version


6. PAT for NuGet feed access

  1. Go to Projects - Home

7. Support for internal NuGet feeds

dotnet nuget update source CPS --source https://pkgs.dev.azure.com/ComprehensivePharmacyServices/_packaging/ComprehensivePharmacyServices%40Local/nuget/v3/index.json  --username [email protected] --password PERSONAL_ACCESS_TOKEN__PAT --store-password-in-clear-text

** If needed, dotnet nuget list source will show all the NuGet sources configured for the solution/project

8. Install coverlet.collector

  1. C:\...PATH_TO_TEST_PROJECT> dotnet add package coverlet.collector --source https://api.nuget.org/v3/index.json

  2. C:\...PATH_TO_TEST_PROJECT> dotnet add package coverlet.msbuild --source https://api.nuget.org/v3/index.json

  3. C:\...PATH_TO_TEST_PROJECT> dotnet clean

Running...

Setup Code Coverage analysis

  1. dotnet nuget disable source "CPS"

  2. dotnet tool install --global dotnet-sonarscanner

  3. C:\...PATH_TO_TEST_PROJECT> dotnet tool install --global dotnet-reportgenerator-globaltool

  4. dotnet nuget enable source "CPS"

  5. In VisualStudio rebuild your solution

  6. C:\...PATH_TO_SOLUTION> dotnet test CPS.RxAssess.Business.Tests/CPS.RxAssess.Business.Tests.csproj --no-build /p:CollectCoverage=true /p:CoverletOutput=./TestResults/coverage.opencover.xml /p:CoverletOutputFormat=opencover


Running a scan locally...

  1. C:\...PATH_TO_SOLUTION> dotnet sonarscanner begin /k:"AccountingAssistant2" /d:sonar.host.url="http://localhost:9000" /d:sonar.token="SONARQUBE_TOKEN" /d:sonar.cs.opencover.reportsPaths="AccountingAssistant.Api.Tests/TestResults/coverage.opencover.xml"

  2. C:\...PATH_TO_SOLUTION> dotnet build CPS.RxAssess.sln

  3. C:\...PATH_TO_SOLUTION> dotnet sonarscanner end /d:sonar.token="SONARQUBE_TOKEN"

See results

  1. Go to http://localhost:9000/dashboard?id=AccountingAssistant2

@DocGreenRob

Supercharger:

tt4e2HN4X3gO09PJIuXK5ZOviuyQORn77YtQ3fsLyPAcScMg3qFGj+8KgLLQr0WWggKFxnyEAezbDaT6Uiyb4N3WzHvKoMl5S24i/eQCCCYQdCeroyqE12g3h7ro3v8sCwKOA10kfQy

@DocGreenRob

The first line should be a single line summary with no more than 50 characters.

The second line should be blank.

Start the full summary on the third line. Ignore whitespace changes in the summary. Use bullet points for each line in the summary when describing changes.
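
For example, a message following these rules might look like this (contents hypothetical):

Add retry policy to payment client

- Wrap outbound HTTP calls in an exponential-backoff retry
- Log each retry attempt with the request correlation id
- Remove the old manual retry loop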


@DocGreenRob

import { AfterViewInit, Component, Input, OnInit } from '@angular/core';
import { CommunicationService } from '../../../services/_internal/communication/communication.service';
import { SessionDataService } from '../../../services/_internal/session-data/session-data.service';
import { CommonModule } from '@angular/common';

@Component({
  selector: 'app-global-spinner',
  standalone: true,
  imports: [CommonModule],
  templateUrl: './global-spinner.component.html',
  styleUrl: './global-spinner.component.scss'
})
export class GlobalSpinnerComponent implements OnInit, AfterViewInit {
  // *********************
  // variables declaration
  // *********************

  // private
  // ********

  // public
  // ******
  @Input() spinnerText: string = "Searching...";

  constructor(private sessionDataService: SessionDataService,
    private communicationService: CommunicationService) {
  }

  // ***************
  // lifecycle hooks
  // ***************
  ngOnInit(): void {
    this.housekeeping();
  }

  ngAfterViewInit(): void {
  }

  // *******
  // methods
  // *******

  // public
  // ******

  // private
  // *******
  private housekeeping() {
  }

}


@DocGreenRob

RxProcessManagerLocalTest.vbs (toggles Scroll Lock every 60 seconds to keep the session alive):

Set WshShell = CreateObject("WScript.Shell")
Do
    WshShell.SendKeys "{SCROLLLOCK}"
    WScript.Sleep 60000 ' 60 seconds
Loop

Run it with: cscript //nologo RxProcessManagerLocalTest.vbs

