Azure SQL and PowerBI

Paul Andrew, someone I have followed on Twitter for a long time and was lucky enough to meet in person this year, posted a tweet about his speaking stats and how he'd visualised them in a PowerBI dashboard.

I thought it looked absolutely amazing, so I set about copying what he'd done for my own speaking involvement. I only started public speaking in August 2017, so I don't have quite the same cool stats that Paul has, but I think it would be really cool to capture what I've done so far and see where I go in the future. Plus, I get to play with some Azure and PowerBI while deploying the solution.

Getting Started

In order to implement Paul's solution you need a SQL database and access to PowerBI. You can use any SQL database that you have available; I am going to implement my solution with an Azure SQL database. I only need a very small database, as it will only be used sporadically and won't store much data.

Deploy an Azure SQL Database

To create my SQL server within Azure I have decided to use the Azure CLI and the Azure Cloud Shell.

Below is the Azure CLI code that I used:

# Set an admin login and password for your database
# (use a strong, unique password in practice)
export adminlogin=ServerAdmin
export password=adminpassword
# The logical server name has to be unique in the system
export servername=SpeakingStats
# The IP address range that you want to allow to access your DB.
# 0.0.0.0 for both start and end is a special rule that allows Azure
# services only; set these to your client's public IP address if you
# want to connect from your own desktop
export startip=0.0.0.0
export endip=0.0.0.0
# Set the Resource Group Name
export rgname=rg-speakinglogs
# Set the Location for the resources to be deployed to
export datacenter=eastus

# Create a resource group
az group create \
    --name $rgname \
    --location $datacenter \
    --tags 'Usage=Speaking'

# Create a logical server in the resource group
az sql server create \
    --name $servername \
    --resource-group $rgname \
    --location $datacenter \
    --admin-user $adminlogin \
    --admin-password $password

# Configure a firewall rule for the server
az sql server firewall-rule create \
    --resource-group $rgname \
    --server $servername \
    --name AllowYourIp \
    --start-ip-address $startip \
    --end-ip-address $endip

# Create a Basic database
az sql db create \
    --resource-group $rgname \
    --server $servername \
    --name SpeakingLogs \
    --collation SQL_Latin1_General_CP1_CI_AS \
    --service-objective Basic \
    --tags 'Usage=Speaking'

The above code will deploy a Resource Group, a logical SQL server and a SQL database, and configure a firewall rule on the server. Note that the 0.0.0.0 start/end range only allows access from Azure services; to connect from your own client (desktop) you need to set the start and end IP variables to your public IP address.

Now that you have the SQL server and database created, you need to connect to them and start adding data.

Connecting to your Azure SQL Database

Connecting to your Azure SQL server or database is no different from how you would interact with an on-premises SQL server/database. You can use SQL Server Management Studio (SSMS), Visual Studio Code, Visual Studio, or another tool that you like to use for SQL interactions. I spend a lot of time within Visual Studio Code, so this is going to be my connection tool.

In order to connect to SQL using Visual Studio Code you need to have the SQL Server (mssql) extension installed. The official Microsoft documentation covers how to install this extension if you need some assistance.

Once that is installed to connect to the SQL database you created earlier follow these steps:

  • Open a new file, either through the GUI or using CTRL+N
  • By default Visual Studio Code creates a new file as Plain Text; we need to change this to SQL. Press CTRL+K M and select SQL from the language list
  • Now you want to connect to the SQL server. To do this, press CTRL+SHIFT+P to launch Visual Studio Code's Command Palette
  • Type sqlcon and select the MS SQL: Connect option
  • Select Create Connection Profile
  • The first prompt will ask for your SQL server name; this will be the name of the SQL server you created earlier followed by ".database.windows.net"
  • The second prompt will ask for the database name; again, this is the name of the database you created earlier
  • The next two prompts ask for the username and password you set earlier
  • The next prompt will ask if you want to save the password so you won't be prompted for it again; configure as appropriate
  • The last prompt asks you to specify a name for the connection profile

Visual Studio Code will now start to connect to your SQL server. You'll see progress in the bottom right-hand corner: it will say "Connecting" and then change to the SQL server name once connected.
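As the first connection prompt suggests, the fully qualified server name is just the logical server name with the Azure SQL DNS suffix appended. A trivial sketch for illustration (the fqdn helper is my own, not part of any Azure tooling):

```python
# Build the fully qualified Azure SQL server name typed at the first prompt.
# "fqdn" is an illustrative helper, not part of any Azure tooling.

def fqdn(server_name: str) -> str:
    """Append the Azure SQL Database DNS suffix to a logical server name."""
    return f"{server_name}.database.windows.net"

print(fqdn("SpeakingStats"))  # SpeakingStats.database.windows.net
```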

Now that you are connected, you need to configure the database.

Setting up the Database

Azure SQL

Paul has shared some SQL scripts within his GitHub repo.

I've been a bit selective with the SQL that I've taken from Paul's scripts; below is what I used to configure my database:


ALTER DATABASE [SpeakingLogs] SET ANSI_NULL_DEFAULT OFF 
GO
ALTER DATABASE [SpeakingLogs] SET ANSI_NULLS OFF 
GO
ALTER DATABASE [SpeakingLogs] SET ANSI_PADDING OFF 
GO
ALTER DATABASE [SpeakingLogs] SET ANSI_WARNINGS OFF 
GO
ALTER DATABASE [SpeakingLogs] SET ARITHABORT OFF 
GO
ALTER DATABASE [SpeakingLogs] SET AUTO_SHRINK OFF 
GO
ALTER DATABASE [SpeakingLogs] SET AUTO_UPDATE_STATISTICS ON 
GO
ALTER DATABASE [SpeakingLogs] SET CURSOR_CLOSE_ON_COMMIT OFF 
GO
ALTER DATABASE [SpeakingLogs] SET CONCAT_NULL_YIELDS_NULL OFF 
GO
ALTER DATABASE [SpeakingLogs] SET NUMERIC_ROUNDABORT OFF 
GO
ALTER DATABASE [SpeakingLogs] SET QUOTED_IDENTIFIER OFF 
GO
ALTER DATABASE [SpeakingLogs] SET RECURSIVE_TRIGGERS OFF 
GO
ALTER DATABASE [SpeakingLogs] SET DATE_CORRELATION_OPTIMIZATION OFF 
GO
ALTER DATABASE [SpeakingLogs] SET ALLOW_SNAPSHOT_ISOLATION ON 
GO
ALTER DATABASE [SpeakingLogs] SET PARAMETERIZATION SIMPLE 
GO
ALTER DATABASE [SpeakingLogs] SET READ_COMMITTED_SNAPSHOT ON 
GO
ALTER DATABASE [SpeakingLogs] SET  MULTI_USER 
GO
ALTER DATABASE [SpeakingLogs] SET ENCRYPTION ON
GO
ALTER DATABASE [SpeakingLogs] SET QUERY_STORE = ON
GO
ALTER DATABASE [SpeakingLogs] SET QUERY_STORE (OPERATION_MODE = READ_WRITE, CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30), DATA_FLUSH_INTERVAL_SECONDS = 900, INTERVAL_LENGTH_MINUTES = 60, MAX_STORAGE_SIZE_MB = 100, QUERY_CAPTURE_MODE = AUTO, SIZE_BASED_CLEANUP_MODE = AUTO)
GO
-- Note: the USE statement is not supported in Azure SQL Database; you are
-- already connected to the database, so this line can be skipped there
USE [SpeakingLogs]
GO
ALTER DATABASE SCOPED CONFIGURATION SET BATCH_MODE_ADAPTIVE_JOINS = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION SET BATCH_MODE_MEMORY_GRANT_FEEDBACK = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION SET ELEVATE_ONLINE = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION SET ELEVATE_RESUMABLE = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION SET GLOBAL_TEMPORARY_TABLE_AUTO_DROP = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION SET IDENTITY_CACHE = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION SET INTERLEAVED_EXECUTION_TVF = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION SET ISOLATE_SECURITY_POLICY_CARDINALITY = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET LEGACY_CARDINALITY_ESTIMATION = PRIMARY;
GO
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 0;
GO
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET MAXDOP = PRIMARY;
GO
ALTER DATABASE SCOPED CONFIGURATION SET OPTIMIZE_FOR_AD_HOC_WORKLOADS = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION SET PARAMETER_SNIFFING = ON;
GO
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET PARAMETER_SNIFFING = PRIMARY;
GO
ALTER DATABASE SCOPED CONFIGURATION SET QUERY_OPTIMIZER_HOTFIXES = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET QUERY_OPTIMIZER_HOTFIXES = PRIMARY;
GO
ALTER DATABASE SCOPED CONFIGURATION SET XTP_PROCEDURE_EXECUTION_STATISTICS = OFF;
GO
ALTER DATABASE SCOPED CONFIGURATION SET XTP_QUERY_EXECUTION_STATISTICS = OFF;
GO

Setting up Tables and Views

The next stage is to set up the tables for storing your speaking data. The SQL to do that is as follows:

/****** Object:  Table [dbo].[SpeakingLog]    ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[SpeakingLog](
    [LogId] [int] IDENTITY(1,1) NOT NULL,
    [TalkDate] [date] NOT NULL,
    [EventName] [varchar](255) NOT NULL,
    [City] [varchar](255) NULL,
    [Country] [varchar](255) NULL,
    [Postcode] [varchar](20) NULL,
    [Attendance] [int] NULL,
    [Tags] [nvarchar](1024) NULL,
    [TalkType] [varchar](100) NULL,
 CONSTRAINT [PK_SpeakingLog] PRIMARY KEY CLUSTERED 
(
    [LogId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

/****** Object:  View [dbo].[RecordCountByTags]   ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE VIEW [dbo].[RecordCountByTags]
AS

SELECT 
    LTRIM(RTRIM(VALUE)) AS Tag,
    COUNT(*) AS RecordCount
FROM 
    [dbo].[SpeakingLog]
    CROSS APPLY STRING_SPLIT([Tags],',')
GROUP BY
    LTRIM(RTRIM(VALUE))

GO
/****** Object:  Table [dbo].[EventLogos]  ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[EventLogos](
    [EventName] [varchar](255) NOT NULL,
    [ImageURL] [nvarchar](max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
-- Note: switching databases with USE is not supported in Azure SQL Database;
-- these final lines only apply when running against an on-premises SQL Server
USE [master]
GO
ALTER DATABASE [SpeakingLogs] SET  READ_WRITE 
GO

Entering your Data

Now that you've got the infrastructure set up, it's time to start adding your speaking data.

You have two tables: one holds all your speaking stats (SpeakingLog), and the other (EventLogos) stores the locations of the logos of events you've spoken at.
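The RecordCountByTags view created earlier splits the comma-separated Tags column and counts how often each tag appears across all your talks. For illustration only, here is the same logic sketched in Python with made-up sample rows:

```python
from collections import Counter

# Sample Tags values as they might appear in the SpeakingLog table (made up).
tags_column = ["Azure, PowerBI", "Azure", "SQL, Azure"]

# Mirror the view's STRING_SPLIT + LTRIM/RTRIM + GROUP BY/COUNT(*) logic.
counts = Counter(
    tag.strip()
    for row in tags_column
    for tag in row.split(",")
)

print(counts)  # Counter({'Azure': 3, 'PowerBI': 1, 'SQL': 1})
```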

Inserting data into the SpeakingLog table is quite easy; I've done a bulk import of my speaking data from last year and this year, using code like the below:

-- LogId is an IDENTITY column, so list the remaining columns explicitly
INSERT INTO [dbo].[SpeakingLog]
    ([TalkDate], [EventName], [City], [Country], [Postcode], [Attendance], [Tags], [TalkType])
VALUES
('2018-11-08', 'STEM', 'Reading', 'United Kingdom', 'RG6 1WG', 25, 'Azure', 'Regular'),
('2017-08-30', 'Azure UG', 'Edinburgh', 'United Kingdom', 'EH1 3EG', 40, 'Azure', 'Lightning Talk');

You can view the results of your data input with this command:

SELECT * FROM [dbo].[SpeakingLog];

The next part of the data input is to add in the logos of the events you've spoken at; again, a bulk import would look like this:

INSERT INTO [dbo].[EventLogos]
VALUES
('STEM', 'https://www.stem.org.uk/sites/all/themes/custom/stem_base/assets/img/patterns/logo-stem-new.svg'),
('Microsoft Cloud User Group', 'https://www.mscug.com/wp-content/uploads/2018/02/MSCUG-logo.jpg');

And you can view the results of your input with this command:

SELECT * FROM [dbo].[EventLogos];

Be sure to match the EventName you've entered in the SpeakingLog table with the EventName you use in the EventLogos table. If they don't match, your data visualisation within PowerBI will be off.
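One way to catch mismatches is a LEFT JOIN from SpeakingLog to EventLogos, keeping only the events with no logo row. Below is that check sketched against an in-memory SQLite database with made-up rows; the same query shape works in T-SQL against your Azure SQL database:

```python
import sqlite3

# In-memory stand-ins for the two tables, with a deliberately missing logo.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE SpeakingLog (EventName TEXT NOT NULL);
    CREATE TABLE EventLogos  (EventName TEXT NOT NULL, ImageURL TEXT);
    INSERT INTO SpeakingLog VALUES ('STEM'), ('Azure UG');
    INSERT INTO EventLogos  VALUES ('STEM', 'https://example.com/stem.svg');
""")

# Events that have no matching logo row.
missing = conn.execute("""
    SELECT DISTINCT s.EventName
    FROM SpeakingLog AS s
    LEFT JOIN EventLogos AS l ON l.EventName = s.EventName
    WHERE l.EventName IS NULL
""").fetchall()

print(missing)  # [('Azure UG',)]
```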

Visualise the data in PowerBI

You will need PowerBI Desktop installed on your machine for this next stage. You can download the latest version from here.

PowerBI is something I've dabbled with in the past, delivering solutions for customers, but it isn't something I'm an expert in. Paul has shared his .PBIX (PowerBI file) on his GitHub page here.

I downloaded the .PBIX file and opened it within PowerBI Desktop. The first thing to do is change the Data Source within the .PBIX file to your SQL database. To do this follow these steps:

  • Click on Edit Queries > Data Source Settings on the top toolbar
PowerBI Data Source
  • Right click on the data source currently there and select Change Source
  • Input your SQL server name and click OK
  • When you return to your PowerBI report you'll see a yellow ribbon along the top asking you to apply the changes

    PowerBI Apply Changes
  • When you click on the apply changes button, you will be prompted to enter credentials for the SQL database. Ensure you click on the Database option and input your username and password
    PowerBI Credentials

PowerBI will connect to your SQL database and start to pull in your data; you should see the report refresh and apply your data.

The next thing to change is the profile pic and contact information. Replace the picture and information with your own.

Once this is done be sure to save your settings. You should end up with a custom report showing your speaking results.

Speaking Report

Overview

I love finding wee gems like this and getting the chance to learn something new along the way. Thanks again, Paul Andrew, for sharing.
