R Shiny: Introduction

Shiny is an R package from RStudio that allows you to host your R code on a website. This is a great method for creating a dashboard or deploying a predictive model. The best part is, it is free. There are, however, paid services associated with RStudio and Shiny if you want professional support and the like.

In this lesson, we are going to learn to build a Shiny server application step by step.

Let's start at the beginning. First off, to use Shiny, you need to have RStudio installed on your computer. Once you have that, you will need to install the Shiny package.
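If you have not installed it yet, you can do so straight from the R console (a quick sketch; shiny is the package name on CRAN):

install.packages("shiny")   # only needed once
library(shiny)              # load the package at the start of each session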


Next, we are going to learn about the three parts of a Shiny server application.

Below is the basic framework of a Shiny app:


ui <- fluidPage("Test")
server <- function(input, output) {}
shinyApp(ui = ui, server = server)

Let’s break this down:

ui: ui is the user interface. This is the code that makes up the web page the user will launch and interact with. For anyone familiar with HTML, this is the basic markup. This is where you set the wording, fonts, colors, what kinds of objects will be on the page, etc.

server: server is the brains of the operation. This is where you load and manipulate any data being used in your page. Again, if you are a web developer, think JavaScript here.

shinyApp: this is the code that launches your App.

If you run the code above, you will see something like this


Note the IP address in the upper left. If you copy this into a browser of your choice, it will run there too.


You will see the same address in the R console. As long as the Shiny server is running, you will see this Listening message. For those not familiar, that address is a loopback address: an internal address that points back to the machine you are on, which means the server is running locally. The :4654 is the port my computer grabbed; it can be different for you. The easiest way for me to explain the port without going too deep into network theory is: think of it like a TV channel. From your computer, you could technically be running hundreds of these Shiny servers. The port (TV channel) is what tells the computer which one of these web pages you want to visit.

If you happen to be on a network, you can go to another computer and type in your host computer's network IP address (you can get it by going to a cmd console and typing ipconfig). It will be the address I have the arrows pointing to in my example.


Then, using my example, type that address (including the port) into the second computer's browser and you should see your Shiny website.

Step 2

Let's see if we can make our website more interesting. Let's start easy: we will format the text.

First, close out your Shiny window to end the session.

Change your ui line to match the one below.

ui <- fluidPage(h1("Test"))

Now run your code again

ui <- fluidPage(h1("Test"))
server <- function(input, output) {}
shinyApp(ui = ui, server = server)


Step 3

Okay, let’s make it a little more interesting by adding some detail to our page. We will do so by using panel methods in our fluidPage()


ui <- fluidPage(
  titlePanel("Title Panel"),
  sidebarPanel("Sidebar Panel"),
  mainPanel("Main Panel")
)
server <- function(input, output) {}
shinyApp(ui = ui, server = server)

Notice your page is now segmented into a title bar, a sidebar and a main panel


Okay, now let’s actually add some R data magic to this.

mpgData <- mtcars

ui <- fluidPage(
  titlePanel("Mpg vs Cylinder Plot"),
  sidebarPanel("Sidebar Panel"),
  mainPanel(plotOutput("mpgPlot"))
)
server <- function(input, output) {
  output$mpgPlot <- renderPlot({
    boxplot(mpg ~ cyl, data = mpgData)
  })
}
shinyApp(ui = ui, server = server)

Let's break down the new code. First things first, we grab the built-in mtcars data set (it ships with R's datasets package) and assign it to a variable called mpgData.


mpgData <- mtcars

Next we are going to skip down to the server section, as this is where we work with our data:

First thing you will notice is function(input, output) – this is a standard call that allows you to pass information back and forth through the server code. In our example, we are only using the output right now.

We create a variable called output$mpgPlot and assign it the result of the Shiny method renderPlot(). Inside that method, we create a simple boxplot comparing mpg to cyl using the mtcars data set that we labeled mpgData.

server <- function(input, output) {
  output$mpgPlot <- renderPlot({
    boxplot(mpg ~ cyl, data = mpgData)
  })
}

The boxplot we rendered as output$mpgPlot is now passed to our ui

Next, in the ui I renamed the title panel, and in the mainPanel I added the command plotOutput("mpgPlot"). That will allow me to post the boxplot I created in the server to the ui web page.

ui <- fluidPage(
  titlePanel("Mpg vs Cylinder Plot"),
  sidebarPanel("Sidebar Panel"),
  mainPanel(plotOutput("mpgPlot"))
)

If you run the code, it should look like this:


Step 4

Now let's add a little interactivity to our web page. We are going to give the user the option to view the boxplots with or without the outliers included.

mpgData <- mtcars

ui <- fluidPage(
  titlePanel("Mpg vs Cylinder Plot"),
  sidebarPanel(checkboxInput("outliers", "Show outliers", FALSE)),
  mainPanel(plotOutput("mpgPlot"))
)
server <- function(input, output) {
  output$mpgPlot <- renderPlot({
    boxplot(mpg ~ cyl, data = mpgData,
            outline = input$outliers)
  })
}
shinyApp(ui = ui, server = server)

This involves only two changes to our code

First we will add a checkbox to our sidebar. In the ui, we will alter the sidebarPanel line to look like this:

sidebarPanel(checkboxInput("outliers", "Show outliers", FALSE)),

The breakdown of the syntax here is

checkboxInput(variable name, display text, default value) – our default is set to FALSE, so the box will be unchecked by default


Next, we will alter our boxplot to take into account the checkbox input

boxplot(mpg ~ cyl, data = mpgData,
        outline = input$outliers)

— notice the variable name is input$outliers. It is a composite of the variable name from our checkboxInput() method and input from the function(input, output) call that opens our server.

Now run your code






SSIS Lesson 3 – Derived Column

In this lesson, we are going to take the first basic step towards building a data warehouse. We are going to create a historical table from the table we built in Lesson 2. If you didn’t do Lesson 2, you can find the SQL needed to build the table at the bottom of this lesson.

If you remember, in Lesson 2 we built a table with a list of students and the work teams they were assigned to. That list was designed to change every week, and we only wanted to know the current week's data. However, now Principal Skinner has asked if we can keep a listing of all the weeks so he can go back and see how many times someone was assigned to one of the work teams.

The table below is our starting table.


What we want to do is copy the table’s contents into a new historical table and mark it somehow so we know what week each kid was on each team. We are going to do this by creating a new column called INSERT_DT which will record the date the data was added to the historical table.

First things first, let’s build our historical table in SQL Server

CREATE TABLE [dbo].[STUDENT_TEAM_HIST](
       [STUDENT_NM] [varchar](50) NULL,
       [STUDENT_AGE] [int] NULL,
       [STUDENT_TEAM] [varchar](50) NULL,
       [INSERT_DT] [datetime] NULL
)

Now go into SSIS and go to your Training Project

Create a new Package, I called mine Lesson_3


Drag a Data Flow object onto your design window and click on it


Now drag an OLE DB Source box over and click on it


Select dbo.STUDENT_TEAM from the drop down and click OK


Now we are going to use a new object – Derived Column. Drag it over and connect the blue arrow from the bottom of the OLE DB Source box to it.

Click on Derived Column: Purple Arrow (far left) – Name your new column, I named mine INSERT_DT

Green Arrow (Middle) : in the Expression column, type GETDATE()

Yellow Arrow (far right) : once you tab out of the Expression column, you will see your Data Type changed itself to be a timestamp

Red Arrow (Bottom) : hit Okay


What we just did was create a new column called INSERT_DT which will use the GETDATE() method to get the current date and time. Next we are going to use this column we just created.
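If you are curious what GETDATE() returns, you can run it on its own in SSMS (just an illustration, not part of the package):

SELECT GETDATE() AS INSERT_DT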

Grab an OLE DB Destination and drag it over. Connect it to the outbound arrow from the Derived Column box.


Select the new dbo.STUDENT_TEAM_HIST table we created, then click on Mappings.


You will see you now have four columns feeding into your new destination table. If you go back and look at your source table, you will see it still only has three columns. The fourth column is the one we just derived.

Hit okay, now we are ready to run the package


Right click on Package and hit Execute Package


It should run. Notice the little comments telling you how many rows moved.


Go to SQL Server and check your new table. You will see the INSERT_DT
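You can also confirm it with a quick query (assuming you used the same table name I did):

select *
from dbo.STUDENT_TEAM_HIST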


Okay, now to see the historical data in action

Let’s change Micah’s team from Red to Blue in our STUDENT_TEAM table

update [dbo].[STUDENT_TEAM]
set STUDENT_TEAM = 'Blue'
where STUDENT_NM = 'Micah'

select *
from [dbo].[STUDENT_TEAM_HIST]


Go back to SSIS, hit the red box at the top to end debugging mode


Right click and execute the package again

Now run your query again. Note that the two different teams Micah was on are listed, along with the timestamps, so you can tell when he was on which team.
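If you only want Micah's history, a small filter on the same table does the trick (a sketch; adjust the names if yours differ):

select *
from dbo.STUDENT_TEAM_HIST
where STUDENT_NM = 'Micah'
order by INSERT_DT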


SQL Query to build table from lesson 2

CREATE TABLE [dbo].[STUDENT_TEAM](
       [STUDENT_NM] [varchar](50) NULL,
       [STUDENT_AGE] [int] NULL,
       [STUDENT_TEAM] [varchar](50) NULL
)

insert into dbo.STUDENT_TEAM
       (STUDENT_NM, STUDENT_AGE, STUDENT_TEAM)
values
       ('Bob', 15, 'Red'),
       ('Claire', 17, 'Blue'),
       ('Chris', 17, 'Blue'),
       ('Candice', 14, 'Red'),
       ('Holly', 16, 'Blue'),
       ('Stephanie', 16, 'Blue'),
       ('Joshua', 14, 'Red'),
       ('Sherrie', 18, 'Blue')

SSIS: Lesson 2 – Import data from CSV into database

In the first lesson we exported data from a table in our SQL Server database to a CSV file. Today we are going to do just the opposite, we are going to bring data from a CSV file into our database.

I like to try to provide close to real world scenarios when I make my tutorials. Obviously in the simpler intro lessons, that can be difficult. That being said, for this lesson, imagine you manage a database for a school. Every week 9 students are selected to be on either the Red Team (Hall Safety Monitors) or the Blue Team (Lunch Room Aids). Every Friday, the Vice Principal Smith picks the student names at random and puts them into a CSV file that is emailed out to all the teachers. You point out that if you could just get the names into your database, you could put the information on the school’s main Intranet page, so the teachers don’t need to download and open the same CSV file 300 times a week.

So our task is to upload the new weekly CSV file into the database every Friday at the end of the school day.

Here is this week’s CSV file:


First, before heading into SSIS, we need to create a destination table in our database for the CSV file. Here is the file we want to import; you'll note it has three columns: two are strings and one is an integer:


So go into SSMS (SQL Server Management Studio) and run the following query to create the table:

CREATE TABLE dbo.STUDENT_TEAM (
 STUDENT_NM varchar(50),
 STUDENT_AGE int,
 STUDENT_TEAM varchar(50))

This code will create an empty table with 3 columns named STUDENT_NM, STUDENT_AGE, STUDENT_TEAM.

Now you can minimize SSMS and open up Visual Studio Data Tools.

Once open, go into your Training Project and create a new SSIS Package. I’m naming this one Lesson_2


Now grab an Execute SQL Task from the SSIS Toolbox and drag it to the Design window.

Notice the red arrow at the bottom, there is already a Connection set up for the Sandbox database. This is because when we made this connection in Lesson_1, we made it by clicking on the Connection Managers in the Solution Explorer window.



This creates a (Project) connection – one that can be seen in every package. If you only want your connection to exist inside a single package (this becomes important as the number of packages you have grows; otherwise you'd have thousands of connections in Solution Explorer), you can right click anywhere inside the Connection Managers box at the bottom of the design window and create it there instead.


But this particular package will work fine just using the existing connection, we can move on without creating a new one.

Double Click on your Execute SQL Task icon in the design window.

In the window that pops up, you can rename your Task and provide a description of the task if you would like. This does come in handy when you have to go back to an old package; it makes it easier to understand what is going on.

Down at the green arrow in the picture, you'll have to select a Connection from the drop down. Luckily we currently only have one to choose from.


Next go to SQL Statement and click on the little ‘…’ box that appears in the right corner.


In the pop up window, this is where you enter your SQL Query. In this case we are going to delete all the data from the table. The reason for this step is so that last week's students won't still be in the table on Monday morning. We only want the current students in our database.
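Something along these lines works (a minimal sketch, assuming the dbo.STUDENT_TEAM table we created above):

delete from dbo.STUDENT_TEAM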


Now click OK on this window and click OK again to get back to the Design Window.

Next drag over a Data Flow Task. Connect it to the Execute SQL Task by dragging the arrow over from that task. Note the direction of the arrow; this is the order in which the tasks will run.


Now open Data Flow Task. In the SSIS Toolbox, go to Other Sources and drag a Flat File Source over to the Design Window.

Double click on your new Flat File Source box and a new window will open. Click on New… next to Flat File Connection Manager


On the new window, click Browse


Browse to your CSV File. I created a folder called SSIS Training to make it easy to store all my files.


If you click on Columns in the left window, you can see what the CSV file contains.


Finally, before exiting this window, click back on General and make sure Column names in the first data row is checked.


Now go back to the Design window and drag an OLE DB Destination box onto the design window. Connect the blue arrow from the Flat File Source to the OLE DB Destination


Now double click on OLE DB Destination

Select STUDENT_TEAM from the Name of the table or the view drop down.


Now Click on Mappings


Everything should line up nicely since we named the table columns to match the CSV file.


Now click OK to go back to the Design Window, then click on Control Flow in the upper left to get back to the main page of the package.


Right click the package and click Execute Package


If all goes well, you should end up with two green checks


If you get an error instead, check to make sure your CSV file is not currently open; that can cause errors. If so, close the file and try executing the package again.


Finally, check the table in your SQL Server to see if it populated
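A quick query will show the rows that were just loaded:

select *
from dbo.STUDENT_TEAM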


If you want to see how it would work the next week, go into the CSV file and change the data. When you run the package again, the names in the SQL table will change too.


SSIS: Lesson 1 – Exporting Data to a CSV File

One of the main purposes of SSIS is moving data from one location to another with ease. In this lesson we are going to export data from our SQL Server database to a CSV file.

If you want to follow along, you can use the following code to create the data table I will be working with.  I loaded this into a database I created for practice called Sandbox.

CREATE TABLE employee_id (
       emp_nm nvarchar(30) not null,
       emp_id nvarchar(8),
       b_emp_id nvarchar(8),
       PRIMARY KEY(emp_nm) );

INSERT INTO employee_id
       (emp_nm, emp_id)
VALUES
       ('Bob', 'A1234567'),
       ('Lisa', 'A1234568')

INSERT INTO employee_id
       (emp_nm, b_emp_id)
VALUES
       ('Priyanka', 'B1234567');

We will start by opening the Training_Project we created in the intro lesson (SSIS Tutorial: Introduction) and creating a new package.


I renamed my new package Lesson_1


Now we need to make a new connection so that SSIS knows which data source it will be communicating with. To do so, go to Solution Explorer, right click on Connection Managers and select New Connection Manager.


Since we will be connecting to a SQL Server, select OLE DB from the list below


Since there are no existing connections to pick from, choose New


Okay, starting from the top select Native OLE DB\SQL Server Native Client 11.0  (note you might have 10.0 – that will work as well)

Since my SQL Server is locally installed on my machine, I am using Localhost as Server Name, otherwise provide the server name here.

Select your database from the drop down, again my database is Sandbox

Finally, hit Test Connection, you should get a connection successful message box.

Click Okay


You’ll see your new connection in the box, now click Okay


Now at the bottom of your Design Window, you’ll see your new connection in the box labeled Connection Managers


So next, we need to go to the SSIS Toolbox and drag a Data Flow Task over to the designer


Once in the designer, click on the Data Flow Task box, this will bring you to the Data Flow window


Data Flow is used whenever you need to move data between disparate systems. Since we will be moving data from SQL Server to a CSV file, we need to use a Data Flow Task

You should note that the SSIS Toolbox has changed, offering up new Data Flow related tools.

Scroll down to Other Sources and drag OLE DB Source to the design box


Double click on the new object


Make sure the Connection Manager you just created is selected in the top drop down. Leave Data access mode at Table or view and select dbo.employee_id as your table.


If you click Preview in the bottom left, you will get a pop up of the data in the table


If you click Columns in the upper left, you will see the columns that will be exported from the table. You can change the Output Column names if you want to use a different name in your output file.

We will skip over Error Output for now, as that is easily an entire lesson all its own.


Now go back to the SSIS toolbox and under Other Destinations, click on Flat File Destination and drag it over to the design window.


Drag the blue arrow from the OLE DB source box to the Flat File Destination Box


It should look like this when done


Now click on Flat File Destination

Since we don’t have a current Flat File Connection, we will need to click on New to create one.


Select Delimited and click OK


Find a folder you want the file to end up in. Select CSV files from the bottom right drop down, and name your file.  Click OK  (Note, use a name of a file that does not currently exist. This will create the file)


Check the box: Column names in first data row


If you click on Columns in the upper left, you will see the names of your header columns.

Click Okay to go back to the Flat File Destination Window


If you click on Mappings, you will see you have 3 columns from your source going to three columns in what will be the new CSV file.  In future lessons I will show how you can use this to match up different named columns.

Click Okay to return to the design window


Go to Solution Explorer, right click on the package and click Execute Package


You should get green check marks to indicate success. This is a very small package, so they probably turned green instantly. In larger jobs, the green check will first be a yellow circle to indicate progress. It turns green when that step is complete.

Note on the connecting line between Source and Destination that you get a read out of how many rows were processed.


Go to the folder you chose as your destination, you will see your new CSV file there


Open the file, you will see your data has been exported into your CSV file
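Based on the sample rows we inserted earlier, the file should look roughly like this (the empty fields are the NULL values from the table):

emp_nm,emp_id,b_emp_id
Bob,A1234567,
Lisa,A1234568,
Priyanka,,B1234567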


SQL: Check if table exists

To check if a table exists in SQL Server, you can use the INFORMATION_SCHEMA.TABLES table.

Running the following code produces the results below:


Select *
from INFORMATION_SCHEMA.TABLES


You can use this table with an IF EXISTS clause to determine how your query responds depending on whether or not a table exists.

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
           WHERE TABLE_NAME = N'employee_id')
  PRINT 'Yes'

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
           WHERE TABLE_NAME = N'employee_ids')
  PRINT 'Yes'
ELSE
  PRINT 'No'


One of the more common uses I find for this is when I need to create a table in a script. I want to make sure a table with the same name doesn't already exist, or my query will fail. So I write a query like the one below.

IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
           WHERE TABLE_NAME = N'employee_id')
  drop table employee_id

CREATE TABLE employee_id (
       emp_nm nvarchar(30) not null,
       emp_id nvarchar(8),
       b_emp_id nvarchar(8),
       PRIMARY KEY(emp_nm) );

INSERT INTO employee_id
       (emp_nm, emp_id)
VALUES
       ('Bob', 'A1234567'),
       ('Lisa', 'A1234568')

INSERT INTO employee_id
       (emp_nm, b_emp_id)
VALUES
       ('Priyanka', 'B1234567');

SQL: Coalesce

Coalesce is a simple but useful SQL function I use quite often. It returns the first non-null value in a list of values.

A common real world application for this function is when you are trying to join data from multiple sources. If you work for a company known for acquiring other companies, your HR tables are bound to be full of all kinds of inconsistencies.

In the example below, you will see a snippet of an HR table created by merging tables from two companies. The first two employees have an emp_id and the last one has a b_emp_id. The last employee, Priyanka, comes from a newly acquired company, so her ID is from the HR table of that company.


So if you just want to merge the 2 emp_id columns into one, you can use the coalesce() function.

The syntax for coalesce is: coalesce(col1, col2, ...) as alias

Coalesce takes the first non-null value. In the example below, we check emp_id first; if it has a value, we use that. If emp_id is null, we look for the value in b_emp_id.

select emp_nm,
coalesce (emp_id, b_emp_id) as emp_id
from employee_id


If you want to try this for yourself, here is the code to build the table out for you.

CREATE TABLE employee_id (
       emp_nm nvarchar(30) not null,
       emp_id nvarchar(8),
       b_emp_id nvarchar(8),
       PRIMARY KEY(emp_nm) );

INSERT INTO employee_id
       (emp_nm, emp_id)
VALUES
       ('Bob', 'A1234567'),
       ('Lisa', 'A1234568')

INSERT INTO employee_id
       (emp_nm, b_emp_id)
VALUES
       ('Priyanka', 'B1234567');


SSIS Tutorial: Introduction

SQL Server Integration Services is the ETL tool for the Microsoft SQL Server platform. SSIS allows you to take data from various sources (from Excel files, to text files, to other databases, etc), and bring it all together.

If you are new to the concept of ETL, SSIS is a great place to start. Click here to learn about ETL.

If you are well versed in another ETL platform, SSIS is a relatively easy system to get up to speed on.

SSIS comes as part of SQL Server Data Tools, which you should be able to install with your SQL Server installation software. You will need SQL Server Standard, Developer or above editions to run SQL Server Data Tools. SQL Server Express does not support Data Tools.

For some unknown reason, once it is installed, you will not find a program called SSIS. Maybe the engineers at Microsoft think this is funny, but in order to run SSIS you will need to look for the following program instead.


I’m using 2015, but for most of what I am doing here, any version should be compatible.

When you launch Data Tools, you will notice it runs inside Visual Studio.


To start an SSIS job, you will either need to open an existing project, or create a new one. In this example, I will create a new one.

File ->New -> Project  (or Ctrl+Shift+N)

Inside the Business Intelligence templates, select Integration Services. I always just select Integration Services Project; I'm not a big fan of the Wizard.


Next step: Name your project


Now you are in SSIS. Here are the 3 main windows you will be starting with.

From left to right:

SSIS Toolbox


Package Designer


Solution Explorer



Inside a solution, packages are the collections of jobs or scripts found inside a project. It is inside a package that you will build out your ETL job.

For our first lesson, we are just going to build a simple package. Your new solution should have opened with a new package already in it. If it did, right click on the package the green arrow points to and rename it; if not, right click where the red arrow points to create a new package.



I renamed my package First_Package, you can name your package whatever you choose.

This first package will simply just display a pop up message. In the SSIS Toolbox, go to Script Task and drag it into the package designer window.



Double click on the Script Task Box in the Design window

Note in my example, I have C# set as the scripting language. The other default option is Visual Basic. If you are more comfortable with that, feel free to use it. I prefer C# mainly because I spent more time working with it.

You don't need to know any C# for this tutorial. This assignment is literally a single line of code. I will cover more C# in the future.

Click Edit Script… to continue


Note, this step can take a minute or so for the script editor to appear. Don’t panic if your computer appears locked up.

Once the script editor opens, don't panic at all the code you see. Luckily Microsoft has done most of the groundwork for us. We only need to scroll down until we see:

public void Main()

Now place your cursor below //TODO: Add your code here

The code you need to type for this script is:

MessageBox.Show("This is my first SSIS Package");


Now I know you will be looking for a save button. But again, our friends at Microsoft might have been drinking when they coded this. Instead, just click the upper right X to close out the whole window –

I know – why would they do it that way? how much effort would a save and close button have cost them? I don’t know. It just is what it is. Just click the X and move on with your life.


Now click the OK button – again I guess Save was too much to type


Now right click on your package (green arrow) and click Execute Package


Your message will pop up in a Message box window


Click OK on the messagebox and Click the red square to end the package execution.


Congrats, you have just built and executed your first SSIS Package.


XML Parsing: Advanced SQL

If you want to play along with the lesson, use the following code to create the table I will be using:

CREATE TABLE employee_lang (
       emp_nm nvarchar(30) not null,
       lang nvarchar(255),
       PRIMARY KEY(emp_nm) );

INSERT INTO employee_lang
       (emp_nm, lang)
VALUES
       ('Bob', 'Python, R, Java'),
       ('Lisa', 'R, Java, Ruby, JavaScript'),
       ('Priyanka', 'SQL, Python');


XML Parsing

XML parsing is a SQL method for separating string data found in a single field in a table. Look at the table below:


This table has two columns: emp_nm (employee name) and lang (programming languages the employee is proficient in). Notice that the lang column has multiple values in each record. While this is easily human readable, if you want it to be more machine usable (think pivot tables or R statistical analysis), you are going to want your data to look more like this:


Notice the table still has the same two columns, but the lang column now only has one word per record. So the question is, how do you do this using SQL?

XML parsing

The code used to “parse” out the data in the lang column is below:

SELECT emp_nm
,SUBSTRING(LTRIM(RTRIM(m.n.value('.[1]','varchar(8000)'))),1,75) AS lang
,x
FROM (
    SELECT emp_nm
    ,CAST('<XMLRoot><RowData>' + REPLACE([lang],',','</RowData><RowData>') + '</RowData></XMLRoot>' AS XML) AS x
    FROM employee_lang
) t
CROSS APPLY x.nodes('/XMLRoot/RowData') m(n)

Let’s break it down a little first.

,CAST('<XMLRoot><RowData>' + REPLACE([lang],',','</RowData><RowData>') + '</RowData></XMLRoot>' AS XML) AS x
FROM  employee_lang

What we are doing with the code above is using the CAST and REPLACE functions to convert the elements in the lang column into an XML string. See results below:


For Bob, this is the result of the CAST/REPLACE code on the lang column

<XMLRoot><RowData>Python</RowData><RowData> R</RowData><RowData> Java</RowData></XMLRoot>

This is how the code works from the inside out.


select REPLACE ('SQL ROCKS', 'S', '!')

If you run the above code, it will return !QL ROCK!

Replace is saying — everywhere an S is in the string, replace it with a !


The CAST function is casting the string as XML data. This is needed for the next section, when we unpack the XML string.

Cross Apply / Value

SELECT emp_nm
,CAST('<XMLRoot><RowData>' + REPLACE([lang],',','</RowData><RowData>') + '</RowData></XMLRoot>' AS XML) AS x
FROM employee_lang
CROSS APPLY x.nodes('/XMLRoot/RowData')m(n)

So without going into way too much detail (you can find entire pages dedicated to Cross Apply and Value()), I'll give you the quick breakdown.

First notice we aliased our CAST() statement as x; that is also why the CAST sits inside a derived table, so the CROSS APPLY can reference it. If you look at the CROSS APPLY, you will see we are asking to look at x.nodes. Had we aliased our cast y, we would be looking at y.nodes.

Now look at m(n) at the end of the line. This is like an array or list in programming languages. Keep that in mind for the next step.

Inside the x.nodes() is ‘/XMLRoot/RowData’ , this is telling us to assign to m everything following /XMLRoot and to iterate n by everything following /RowData, so for Bob:

m(n=1) = Python

m(n=2) = R

m(n=3) = Java

Now we pass that array m(n) to our Value() method.  Hence m.n.value().  Note m and n were just letters I picked, you can use others.

Inside m.n.value('.[1]','varchar(8000)'), the varchar(8000) was used as it should pretty much cover any size string you may have to deal with.
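If you want to see the raw values before any cleanup, you can run the value() step on its own (a sketch using the same derived-table form as the query above):

SELECT emp_nm
,m.n.value('.[1]','varchar(8000)') AS lang
FROM (
    SELECT emp_nm
    ,CAST('<XMLRoot><RowData>' + REPLACE([lang],',','</RowData><RowData>') + '</RowData></XMLRoot>' AS XML) AS x
    FROM employee_lang
) t
CROSS APPLY x.nodes('/XMLRoot/RowData') m(n)

Note the leading spaces on values like ' R' and ' Java'; they come from the ', ' separators in the original column.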

So the final query simply adds SUBSTRING, LTRIM, and RTRIM to clean it up and get rid of the white space.