Azure – Application Settings and Connection Strings

Resources for storing connection strings and application settings in Azure apps.


#azure #azure-database

Azure SQL Database (MSSQL) – Working with JSON Data


SELECT Name, Surname,
  JSON_VALUE(jsonCol, '$.info.address.PostCode') AS PostCode,
  JSON_VALUE(jsonCol, '$.info.address."Address Line 1"') + ' '
  + JSON_VALUE(jsonCol, '$.info.address."Address Line 2"') AS Address,
  JSON_QUERY(jsonCol, '$.info.skills') AS Skills
FROM People
WHERE ISJSON(jsonCol) > 0
  AND JSON_VALUE(jsonCol, '$.info.address.Town') = 'Belgrade'
  AND Status = 'Active'
ORDER BY JSON_VALUE(jsonCol, '$.info.address.PostCode')


DECLARE @json NVARCHAR(MAX);
SET @json = '{"info": {"address": [{"town": "Belgrade"}, {"town": "Paris"}, {"town": "Madrid"}]}}';
SET @json = JSON_MODIFY(@json, '$.info.address[1].town', 'London');
SELECT modifiedJson = @json;
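For comparison, the same single-path update sketched in Python (a hypothetical equivalent, not part of the original note): load the document, assign through the same path, and serialize again.

```python
import json

# Same document and edit as JSON_MODIFY(@json, '$.info.address[1].town', 'London')
doc = json.loads('{"info": {"address": [{"town": "Belgrade"}, {"town": "Paris"}, {"town": "Madrid"}]}}')
doc["info"]["address"][1]["town"] = "London"  # path: $.info.address[1].town
modified_json = json.dumps(doc)
```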

Convert JSON collections to a rowset

DECLARE @json NVARCHAR(MAX);
SET @json = N'[
  {"id": 2, "info": {"name": "John", "surname": "Smith"}, "age": 25},
  {"id": 5, "info": {"name": "Jane", "surname": "Smith"}, "dob": "2005-11-04T12:00:00"}
]';

SELECT *
FROM OPENJSON(@json)
  WITH (
    id INT 'strict $.id',
    firstName NVARCHAR(50) '$.info.name',
    lastName NVARCHAR(50) '$.info.surname',
    age INT,
    dateOfBirth DATETIME2 '$.dob'
  );
Resources & more:


MySQL has had JSON support since version 5.7:

#azure #azure-database #json #mssql-json

Visual programming with hardware and Node-RED

A good podcast about Node-RED from its creators.

JS Party 95: Visual programming with hardware and Node-RED – Listen on

BigQuery with .NET Core

Here is a sample repository ready to be injected into an ASP.NET Core application.

    public class SelfViewRepository : ISelfViewRepository
    {
        private readonly string _projectId;
        private readonly GoogleCredential _gcpCredential;

        public SelfViewRepository(string projectId, string credentialFile)
        {
            _projectId = projectId;
            _gcpCredential = GoogleCredential.FromFile(credentialFile);
        }

        public async Task<BigQueryResults> GetData(string query)
        {
            BigQueryClient client = BigQueryClient.Create(_projectId, _gcpCredential);

            BigQueryJob job = client.CreateQueryJob(
                sql: query,
                parameters: null,
                // options: new QueryOptions { UseQueryCache = false });
                options: new QueryOptions());

            // Wait for the job to complete and return the results.
            return await job.GetQueryResultsAsync();
        }
    }

#googlecloud #bigquery #aspnetcore

JSON Type with MySQL & EF Core

As of MySQL 5.7.8, there is support for a native JSON type. Setting the column type to json does the job.

  [Column(TypeName = "json")]
  public string Settings { get; set; }

Or with fluent api:

       modelBuilder.Entity<Blog>(eb =>
            eb.Property(b => b.Settings).HasColumnType("json"));

If you send invalid JSON, MySQL will throw an “Invalid JSON text” error.
Details on usage are in the official documentation:
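MySQL’s column validation behaves like any strict JSON parser: parse or reject. A quick Python sketch of the same idea (illustrative only, using the standard json module):

```python
import json

def is_valid_json(text: str) -> bool:
    """Parse-or-reject, mirroring how a json column validates its input."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False
```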

In the case of custom objects, Pomelo has a feature as well:

A nice tutorial from

#json #mysql #mysqlJson #efcore

Google translate in google sheets

Gif from:

#GoogleSheets #GoogleTranslate

gcloud basics cheat sheet

GCloud Cheat Sheet:

App Deploy

  • gcloud app deploy ~/my_app/app.yaml

List Versions

  • gcloud app versions list
  • JSON: gcloud app versions list --format json

List services

  • gcloud app services list

Delete services

  • gcloud app services delete

List versions (PowerShell)

  • gcloud app versions list --format json | ConvertFrom-Json
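The JSON output can be scripted in other languages too. A Python sketch that finds versions receiving no traffic (candidates for `gcloud app versions delete`), assuming the `id` and `traffic_split` fields that gcloud emits:

```python
import json

# Sample shape of `gcloud app versions list --format json` output
# (field names assumed from gcloud's JSON format; traffic_split is a 0..1 float)
versions_json = '''[
  {"id": "v1", "traffic_split": 0.0},
  {"id": "v2", "traffic_split": 1.0}
]'''

# Versions with no traffic are safe candidates for deletion
idle = [v["id"] for v in json.loads(versions_json) if v["traffic_split"] == 0]
```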

Delete Versions

  • gcloud app versions delete

Read logs / tail

  • Read: gcloud app logs read
  • Tail: gcloud app logs tail
  • Filter by service: gcloud app logs read --limit 10 --service=default
  • Filter by version: gcloud app logs read --version=v1

Deploy google function with node

gcloud functions deploy pplusFunction --runtime nodejs8 --trigger-http

Split Traffic

To send all traffic to ‘v2’ of service ‘s1’, run:
gcloud app services set-traffic s1 --splits v2=1
To split traffic evenly between ‘v1’ and ‘v2’ of service ‘s1’, run:
gcloud app services set-traffic s1 --splits v2=.5,v1=.5
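When splits are computed programmatically, it helps to build the --splits value from a dict and sanity-check the weights. A hypothetical Python helper (the function name and the sum-to-1 check are my additions, not gcloud requirements, since gcloud normalizes weights):

```python
def splits_arg(weights):
    """Format {"v2": 0.5, "v1": 0.5} as the --splits value "v2=0.5,v1=0.5"."""
    # Keeping weights summing to 1 makes intent explicit
    assert abs(sum(weights.values()) - 1) < 1e-9, "weights should sum to 1"
    return ",".join(f"{v}={w}" for v, w in weights.items())

cmd = f"gcloud app services set-traffic s1 --splits {splits_arg({'v2': 0.5, 'v1': 0.5})}"
```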

Google Cloud Build Trigger: ASP.NET Core and Angular

# run npm install for Angular
- name: ''
  args: ['install']
  dir: 'MyProject.Web/ClientApp'
# build Angular for production
- name: ''
  args: ['run', 'build','--','--prod']
  dir: 'MyProject.Web/ClientApp'
# publish the .NET Core solution
- name: microsoft/dotnet:2.2-sdk
  args: ['dotnet', 'publish','-c','Release']
# deploy the webapi to the AppEngine
- name:
  args: ['app', 'deploy', './MyProject.Web/bin/Release/netcoreapp2.2/publish/app.yaml','--version','staging']
timeout: 660s

Other examples:

#GoogleCloud #gcp #BuildTrigger

Compiling vs Transpiling

A question that I heard recently: compiling vs transpiling.

It seems that Wikipedia has enough information as well: “A source-to-source compiler, transcompiler or transpiler is a type of compiler that takes the source code of a program written in one programming language as its input and produces the equivalent source code in another programming language. A source-to-source compiler translates between programming languages that operate at approximately the same level of abstraction, while a traditional compiler translates from a higher level programming language to a lower level programming language.”
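The definition above can be demonstrated in miniature with Python’s ast module: parse source code, transform the tree, and emit source code again, staying at the same level of abstraction. This toy rewrite (my own illustration, unrelated to any real transpiler) turns `x ** 2` into `x * x`:

```python
import ast

class PowToMul(ast.NodeTransformer):
    """Toy source-to-source step: rewrite `x ** 2` into `x * x`."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if (isinstance(node.op, ast.Pow)
                and isinstance(node.right, ast.Constant)
                and node.right.value == 2):
            return ast.BinOp(left=node.left, op=ast.Mult(), right=node.left)
        return node

source = "def square(n):\n    return n ** 2\n"
tree = ast.fix_missing_locations(PowToMul().visit(ast.parse(source)))
transpiled = ast.unparse(tree)  # source in, source out (Python 3.9+)
```

The output is still ordinary Python with identical behavior, which is exactly the “approximately the same level of abstraction” property the Wikipedia definition describes.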

Here is the part that NativeScript has in its tooling section:

Well, considering the important role TypeScript plays nowadays, transpiling is a term we will hear a lot.

#typescript #angular #nativescript

The 7 Steps of Machine Learning

Machine learning, simply explained by Yufeng…

Visit the whole playlist too: