Congressional District Polygons

Here I am again working on my House of Congress Windows 8/Metro app for the contest, and now I want to show the individual districts each member of Congress represents.  A Bing search landed me back at the Census website, this time looking at the congressional district boundary files:  http://www.census.gov/geo/www/cob/cd110.html

I don’t know anything about the .e00 or .shp formats, so I’ll be working with the available ASCII files.

Each state’s zip file contains two .dat files.  One contains the long/lat coordinates for the various map polygons; the other defines which district each polygon belongs to.  What I wanted was a way to tie the .dat files to the Districts table I’d already defined at the start of my development.

I need this information in the most convenient format for my application, so I’ll be importing the information for each state into my SQL database.  I created a new table defined as follows:

[screenshot: Boundaries table definition]

Like many of my tables, I’ve set up an Id which auto-increments and is the primary key for each row.  Next we have the linked DistrictID, the PolygonID identified within the .dat file, and the Latitude and Longitude values.
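
If you’d rather script that table than click it together in Management Studio, something like the following would do it… treat the column types as my best guess from the description above (a sketch, not the exact definition from the screenshot):

//requires: using System.Data.SqlClient;
//Minimal sketch of the Boundaries table described above -- column types are assumptions.
static void CreateBoundariesTable(string connectionString)
{
    const string sql = @"
        CREATE TABLE Boundaries (
            Id         INT IDENTITY(1,1) PRIMARY KEY,
            DistrictID INT   NOT NULL,
            PolygonID  INT   NOT NULL,
            Latitude   FLOAT NOT NULL,
            Longitude  FLOAT NOT NULL
        );";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}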

The first thing to do was download each state’s ASCII zip file, unblock it (a Windows 7 “feature”), extract it, and rename the files… which only took 20 minutes.

Then, using my import program, I follow this pseudocode to get the data into my database:

foreach DAT file
    read in dat and def files
    extract state name           //will use this later to get the district reference

    open database
    find state

    foreach dat file line
        parse long/lat and current polygon # if available
        if a new polygon # was found then
            foreach def file line
                if it references the current polygon # then
                    district id = district from that line
        if we have a district id and a current polygon
            insert the new coordinate into the database
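
For the curious, here’s a rough C# sketch of what that pseudocode turns into inside my importer.  The parsing rules are assumptions about the ASCII boundary layout (a polygon header line followed by longitude/latitude pairs, ending with END), and LookupDistrictId is a hypothetical helper that resolves the def-file district value against my existing Districts table:

//requires: using System; using System.Collections.Generic; using System.Data.SqlClient;
//          using System.Globalization; using System.IO;
//Sketch only -- adjust the parsing to match the actual .dat/.def layout.
static void ImportState(string datPath, string defPath, int stateId, SqlConnection conn)
{
    //build a polygon# -> DistrictID lookup from the definition file (format assumed)
    var districtByPolygon = new Dictionary<int, int>();
    foreach (string defLine in File.ReadLines(defPath))
    {
        string[] parts = defLine.Split(new[] { ' ', '\t' }, StringSplitOptions.RemoveEmptyEntries);
        int polygonNumber;
        if (parts.Length >= 2 && int.TryParse(parts[0], out polygonNumber))
            districtByPolygon[polygonNumber] = LookupDistrictId(stateId, parts[1]);
    }

    int currentPolygon = -1;
    foreach (string line in File.ReadLines(datPath))
    {
        string[] parts = line.Split(new[] { ' ', '\t' }, StringSplitOptions.RemoveEmptyEntries);

        if (parts.Length == 1 && parts[0] == "END") { currentPolygon = -1; continue; }  //end of a polygon
        if (parts.Length >= 3) { currentPolygon = int.Parse(parts[0]); continue; }      //header line (assumed)

        int districtId;
        if (currentPolygon >= 0 && parts.Length == 2 &&
            districtByPolygon.TryGetValue(currentPolygon, out districtId))
        {
            double lon = double.Parse(parts[0], CultureInfo.InvariantCulture);
            double lat = double.Parse(parts[1], CultureInfo.InvariantCulture);

            //conn is assumed to be an already-open SqlConnection
            using (var cmd = new SqlCommand(
                "INSERT INTO Boundaries (DistrictID, PolygonID, Latitude, Longitude) " +
                "VALUES (@district, @polygon, @lat, @lon)", conn))
            {
                cmd.Parameters.AddWithValue("@district", districtId);
                cmd.Parameters.AddWithValue("@polygon", currentPolygon);
                cmd.Parameters.AddWithValue("@lat", lat);
                cmd.Parameters.AddWithValue("@lon", lon);
                cmd.ExecuteNonQuery();
            }
        }
    }
}

//Hypothetical helper: looks up the Id in my Districts table for this state + district value.
static int LookupDistrictId(int stateId, string districtValue)
{
    //...query the Districts table here...
    return 0;   //placeholder
}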

In the end we have a Boundaries table that looks something like this:

[screenshot: sample rows from the Boundaries table]

If you have any questions or would like to see the source code for my importer, mention it in the comments section below.

XML Deserialization

So, you have an XML document, you’re programming in Visual Studio 11 for your new Windows 8 Metro app, and you need to access the data within?  I’m here to help…

Deserializing your XML document is very simple.  First you’ll need to define class objects structured the same way your XML document is…

Here’s my example XML document:

<?xml version="1.0" encoding="UTF-8" ?>

<Congress>
  <Regions>
    <Region>
      <Title>Northeast</Title>

      <States>
        <State>
          <Title>Connecticut</Title>

          <Districts>
            <District>
              <Title>District 1</Title>
              <SubTitle>Larson, John B.</SubTitle>
            </District>
          </Districts>
        </State>
      </States>
    </Region>
  </Regions>
</Congress>

This is an example of an XML document I’m currently using in my contest application.  As you can see, there can be multiple Regions, States, or Districts, each organized in sub-objects… I’m showing you only one of each since it would get pretty long otherwise…

Now, we’ll need a few classes organized just like this XML document:

[XmlRoot("Congress")]
public class clsCongress
{
    [XmlArray("Regions")]
    [XmlArrayItem("Region", typeof(clsRegion))]
    public clsRegion[] Region { get; set; }
}

[XmlRoot("Regions")]
public class clsRegion
{
    [XmlElement("Title")]
    public string Title { get; set; }

    [XmlArray("States")]
    [XmlArrayItem("State", typeof(clsState))] 
    public clsState[] State { get; set; }
}

[XmlRoot("States")]
public class clsState
{
    [XmlElement("Title")]
    public string Title { get; set; }
        
    [XmlArray("Districts")]
    [XmlArrayItem("District", typeof(clsDistrict))]
    public clsDistrict[] District { get; set; }
}

[XmlRoot("Districts")]
public class clsDistrict
{        
    [XmlElement("Title")]
    public string Title { get; set; }
        
    [XmlElement("SubTitle")]
    public string SubTitle { get; set; }
}


As you can see from my class objects, you’ll need to specify the XmlRoot for each class as well as each XmlElement so everything maps correctly.  In my example the names match, but they don’t have to, so long as they’re mapped with the XmlElement attribute.
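
For example, if I wanted a property name that differs from the XML element (a hypothetical rename, not something from my actual classes), the attribute handles the mapping:

//Hypothetical example: the property name doesn't match the XML element,
//so [XmlElement] tells the serializer which element to read it from.
[XmlElement("SubTitle")]
public string RepresentativeName { get; set; }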

Now, I know you can find this code anywhere on the net… but what they don’t explain in detail is the array properties I’ve shown here.

In my example, Congress is made up of multiple Regions, and each Region is made up of multiple States and Districts.  The classes above show how to pull in those multiple sub-objects easily and quickly.

Here is the code I use to load up the XML text.  Oh, and I’m downloading this content using the new async features in VS11 and Windows 8:

//requires: using System.Net.Http; using System.Threading.Tasks; using System.Xml.Serialization;
internal async Task PullRegionsAsync(Uri baseUri)
{
    string baseUrl = "https://danielheth.com/myexample.xml";

    //download the data xml
    //http://msdn.microsoft.com/en-us/library/windows/apps/system.net.http.httpclienthandler.maxrequestcontentbuffersize(v=VS.110).aspx
    //says there is a 65k limit on the size of what this function will download...
    var client = new HttpClient();
    var response = await client.GetAsync(baseUrl);  //baseUrl already points at the full xml file

    //------------------------------------------------
    //convert xml into an easily digestible object
    clsCongress congress = null;
    XmlSerializer des = new XmlSerializer(typeof(clsCongress));
    congress = (clsCongress)des.Deserialize(response.Content.ContentReadStream);
    //------------------------------------------------

    foreach (clsRegion r in congress.Region)
    {
        string RegionTitle = r.Title;

        foreach (clsState s in r.State)
        {
            string StateTitle = s.Title;

            foreach (clsDistrict d in s.District)
            {
                string DistrictTitle = d.Title;
            }
        }
    }
}

There you have it… deserializing your XML document into an easily digestible object within C#.

If you have any questions or comments please leave them below…

Windows 8 Metro Application–Bing Maps

I’ve entered myself into a new Windows 8 application contest with a wonderful idea that utilizes Bing Maps.  After an awful lot of research, I’ve come up with something that works… perfectly!

Since this is a Windows 8 application, I’m writing it in Visual Studio 11 Express, which supports the awesome new async/await feature for asynchronous code.  I’ll be taking advantage of that here.

The first thing I needed to do was acquire my Bing Maps developer key… acquire yours for free from:  https://www.bingmapsportal.com

Let’s start coding!!!

I followed this MSDN article to set up my app for access to the Bing SOAP APIs:  http://msdn.microsoft.com/en-us/library/cc966738.aspx

Then added the following to the top of my code:

using HouseofRepresentatives.GeocodeService;
using HouseofRepresentatives.SearchService;
using HouseofRepresentatives.ImageryService;
using HouseofRepresentatives.RouteService;


private ImageSource _map = null;
public String Latitude = "";
public String Longitude = "";


public ImageSource Map
{
    get
    {
        //lazy-load: reading the property the first time kicks off the async map download;
        //OnPropertyChanged fires later and the data binding picks up the finished image
        if (this._map == null) UpdateMap();
        return this._map;
    }

    set
    {
        if (this._map != value)
        {
            this._map = value;
            this.OnPropertyChanged("Map");
        }
    }
}
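
For that OnPropertyChanged call to do anything, the containing class has to implement INotifyPropertyChanged.  I’m not showing my whole class here, but the plumbing is the usual boilerplate… roughly like this (the class name is just a placeholder):

//requires: using System.ComponentModel;
//Minimal sketch of the INotifyPropertyChanged plumbing the Map property relies on.
public class MapViewModel : INotifyPropertyChanged   //class name is a placeholder
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}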

I had two situations for my locations… in some cases I knew the exact Lat/Lon, and in others I only knew an address… in my case a state name.  First I needed to turn that address into a Lat/Lon to pass on to the get-map function.

private async Task<string> getLocationPoint(string address)
{
    if (address != null && address != "")
    {
        GeocodeRequest request = new GeocodeRequest();
        request.ExecutionOptions = new HouseofRepresentatives.GeocodeService.ExecutionOptions();
        request.ExecutionOptions.SuppressFaults = true;
        GeocodeServiceClient geocodeClient = 
            new GeocodeServiceClient(GeocodeServiceClient.EndpointConfiguration.BasicHttpBinding_IGeocodeService);
        HouseofRepresentatives.GeocodeService.Credentials t = 
            new HouseofRepresentatives.GeocodeService.Credentials();
        t.Token = "[[put your Bing Maps Key Here]]";

        request.Query = address;
        request.Credentials = t;

        GeocodeResponse response = await geocodeClient.GeocodeAsync(request);
        if (response.Results.Count() > 0)
        {
            return response.Results[0].Locations[0].Latitude.ToString() + 
                "," + response.Results[0].Locations[0].Longitude.ToString();
        }
    }
    return "";
}

Next I needed to actually get the image from Bing, using the following function:

private async Task<string> GetImagery(string locationString)
{  //http://msdn.microsoft.com/en-us/library/dd221354.aspx
    string key = "[[insert your Bing Maps Key here]]";
    MapUriRequest mapUriRequest = new MapUriRequest();

    // Set credentials using a valid Bing Maps key
    mapUriRequest.Credentials = 
        new HouseofRepresentatives.ImageryService.Credentials();
    mapUriRequest.Credentials.ApplicationId = key;
                      

    // Set the location of the requested image
    mapUriRequest.Center = new HouseofRepresentatives.ImageryService.Location();
    string[] digits = locationString.Split(',');
            
    mapUriRequest.Center.Latitude = double.Parse(digits[0].Trim());
    mapUriRequest.Center.Longitude = double.Parse(digits[1].Trim());

    // Set the map style and zoom level
    MapUriOptions mapUriOptions = new MapUriOptions();
    mapUriOptions.Style = MapStyle.AerialWithLabels;
    mapUriOptions.ZoomLevel = 17;

    // Set the size of the requested image in pixels
    mapUriOptions.ImageSize = new HouseofRepresentatives.ImageryService.SizeOfint();
    mapUriOptions.ImageSize.Height = 240;
    mapUriOptions.ImageSize.Width = 480;

    mapUriRequest.Options = mapUriOptions;

    //Make the request and return the URI
    ImageryServiceClient imageryService = 
        new ImageryServiceClient(ImageryServiceClient.EndpointConfiguration.BasicHttpBinding_IImageryService);
            
    try
    {
        MapUriResponse mapUriResponse = await imageryService.GetMapUriAsync(mapUriRequest);

        if (mapUriResponse.Uri != null)
        {
            return mapUriResponse.Uri;
        }
        else { return ""; }
    }
    catch (Exception)
    {
        //if the imagery request fails for any reason, fall back to no image
        return "";
    }
}

Ok… now that I have my image Uri, it’s time to use it to load a picture:

public async Task<bool> UpdateMap(string address = "")
{
    if (address != "")
    {
        //then user is specifying an address instead of a specific long/lat.  
        //We need to get that info in order to proceed...
        string point = await getLocationPoint(address);
        if (point != "")
        {
            string[] p = point.Split(',');
            Latitude = p[0];
            Longitude = p[1];
        }
    }

    if (Longitude != "" && Latitude != "")
    {
        string uripath = await GetImagery(Longitude + "," + Latitude);
        if (uripath != "")
        {
            this.Map = new BitmapImage(new Uri(uripath));
            return true;
        }
        else
        {
            //imageBaseUri is assumed to be a field defined elsewhere in this class
            this.Map = new BitmapImage(new Uri(imageBaseUri, "Data/blank.png"));
            return false;
        }
    }
    else
    {
        this.Map = new BitmapImage(new Uri(imageBaseUri, "Data/blank.png"));
        return false;
    }
}

If you’ve stuck with me this far… you are a programmer… LOL.

In each of my functions I utilized the new async/await features.  This means when I call UpdateMap as shown below, the call returns right away and the image loads once it has been downloaded and cached.  The image will automatically appear when ready…

//calling the get-map function with an address
UpdateMap("Arkansas");

//or
UpdateMap("1 Microsoft Way, Redmond, Washington");

//or I can set my lat/lon first, then update
Latitude = "39.450000762939453";
Longitude = "-98.907997131347656";
UpdateMap();
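
And if you’d rather know when the map has actually finished loading, UpdateMap returns a Task<bool> you can await from any async method… something like this hypothetical button click handler:

//requires: using Windows.UI.Xaml; -- hypothetical handler, awaiting UpdateMap lets you
//react to success or failure instead of firing and forgetting
private async void ShowDistrict_Click(object sender, RoutedEventArgs e)
{
    bool loaded = await UpdateMap("Arkansas");
    if (!loaded)
    {
        //the map couldn't be loaded... show a message or fall back here
    }
}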

If you guys have any questions or comments, be sure to post them below!

Migrating Access Data to SQL 2008r2

I recently spent some time building an Access database file full of 2010 US Census data… As an Access database, it’s not very useful since I want to use the data within as a feed for my Windows 8 Metro application.  I’ll need to move this data into my SQL 2008r2 server to make it available to my application.

To do this I’ll be using the Microsoft SQL Server Migration Assistant (SSMA) toolkit I recently installed.

 


Upon first use, I’ll need to acquire a license for the tool as instructed by the included Readme file.

[screenshots: SSMA license registration form]

After filling out the form, they provide me a file which I will throw onto a network file share and give to the tool…

[screenshot: license import dialog]

OK, in the above pic I didn’t need to specify the filename… just the directory it sat in… then the license imported correctly.

[screenshots: migration wizard steps]

After specifying the database, I had to give it a minute or so to completely read in and load my database file… the 2010 Census file is right at 100 MB.


Then specify my SQL server, destination Database name, and my authentication method… I’m in a domain so I’ll simply use Windows Authentication since my Domain account has the appropriate level of access to the SQL Server.

[screenshot]

Yes Please…


I do not have any pre-existing applications which connected to this Access file… thus no need to link anything.


Let the Migration begin!


At this point I could see the connection and the data migration occurring…


Back on the server, my tables are now created…


On the Migration Wizard, the data is slowly being copied over…


This is where my SQL server is running a tad slow… a lot of data being input into the database…


Wizard complete, and I can see the newly arrived data on the server.


According to the migration report… there were warnings… but no errors.  The warnings mostly related to database quality, i.e. some tables did not have primary keys, and others had their primary key columns configured as nullable.


Those tables with nullable primary keys were corrected to not allow null values.


Those without primary keys were ignored since SQL Server doesn’t consider that a serious problem…


That and I’m not worried about those two tables considering they were used during the “Import” process from the Census .dp files and won’t be used again.

It’s time to integrate the newfound data into my Windows 8 application!

If you have any questions or comments on this, please leave them below!

Microsoft SQL Server Migration Assistant for Access

Microsoft SQL Server Migration Assistant (SSMA) is a toolkit for quickly and easily converting Access databases to full-fledged SQL Server databases.  Exactly what I was looking for to move my newly created 2010 Census Access database to someplace more usable for my purposes.

Download the toolkit at:  http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=959

And thanks goes out to the Access Team over at the Access Blog for pointing me to the right place, which I found via a wonderful Bing search.

Here’s the installation of that tool…

[screenshots: installer steps]

I have plenty of space and do not like to go back and reinstall things I need… so I’ll do a complete installation.


That’s it… check out my next article, where I describe using it to actually migrate Access data to my SQL 2008r2 server.

If you have any questions or comments, please leave them below!

Creating a 2010 Census Database

So you’re interested in the 2010 Census results.  Well, they’re publicly available at http://www2.census.gov/census_2010/03-Demographic_Profile/ and with a little bit of work, you can have yourself a fully stocked Access database with complete 2010 Census data.

Here’s a simple step-by-step using Microsoft Office Access 2010 to create a database and import the data for the state(s) you want to collect.  Later I’ll show you how to turn the resulting Access 2010 file into a SQL database for SQL 2008r2.

Download the “DPSF2010_Access.accdb” file from the URL above.  Also download the various .zip files under the states you’re interested in.  Then open up the Access file and perform the following steps as described in their documentation…


We’ll first import the demographics data… so select the non-“geo” file.


Select the “Data Part…” specification since this relates to the Demographics file.  When you repeat this wizard for the Population (geo) file, you’ll need to choose the other “GeoHeader…” option on this screen.

[screenshot: import specification selection]

The specs will automatically change to the correct format.


This step is important… so specify the correct primary key.


Name your table… I’m importing everything, so I’ll name each table after the state, appending “Demo” for the demographics file and “Pop” for the geo file.  This will give me tons of data in two different tables for each state.


Have fun with the data… if you have any questions or comments, please add them below!

Upgrading Servers with Microsoft SQL 2008r2

I have a server whose hard drive is failing, and I’ve been unable to get an image of the drive.  This particular server is one of the most important on my network as it is my SQL database server and links to multiple other services:  Microsoft Team Foundation Server, Microsoft SharePoint, IBM Tivoli Endpoint Manager (BigFix), and several of my personal projects.

Since I have upgraded my virtual host server, I plan on running my new SQL server as a virtual (see: 1, 2, 3).  I’m assigning 4 cores and three VHD files running from multiple hard drives within my virtual host server.  This should serve my applications well.  I’m now left with the problem of migrating my databases from the old, failing server to the new one.

The first thing to do is stop all services that rely on the old SQL server. 


Stop my custom vGPS website

First up, my custom virtual GPS website… This website takes someone’s public IP address, performs a database lookup, and returns the previously recorded GPS coordinates.



Stop Team Foundation Server

Next up, my Team Foundation Server… here I plan on just shutting down that virtual.  During the migration I’ll change the DNS name for my old SQL server to point to the new SQL server, so when TFS starts back up it shouldn’t see anything change.



Stop Tivoli Endpoint Manager services

Finally I’ll need to disable my BigFix infrastructure by stopping all of the related services.


I also had to stop my BES Client, due to the policies I’ve put in place that monitor for stopped/failed services and restart them.


While I’ve been documenting what needs to be stopped, I’ve also been preparing and updating my new virtual SQL server:  Windows Updates, joining the domain, installing SQL, etc…

Visit my “Installing Microsoft SQL Server 2008r2” article for details on how to set up your own SQL server.

Now we’re ready to back up, detach, move, and attach our databases.

Backup Databases

First, let’s back up each of the databases that will be migrated over:

Open up SQL Mgmt Studio


Open up localhost->Databases and right-click on each database->Tasks->Back Up.  The default location should be fine unless you’re short on space.  If so, remove the destination at the bottom and redirect it to a different location.


Repeat until you have all of the desired databases backed up and safe.

Since I went with the defaults, all of my database backups show up in “C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Backup”.  I will leave them there for now and utilize them if something goes wrong later on.


Detach Databases

Next we’ll need to Detach these databases from the old SQL Server.


The database files will remain where they were.  In my case I had them configured to sit in the default directory:

C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA

This process will detach the database from this instance of SQL Server and release its two related files:  [database name].mdf and [database name]_log.LDF


I will repeat the detaching process until I get all 10 databases:  1 for my custom vGPS application, 4 related to BigFix, and 5 related to TFS.

I now have 20 files to be moved over to the new SQL server.  I’ll do this using robocopy.  See my Robocopy how-to in a different article.

I’m going to move them straight over to the new SQL server’s data drive… so I used the following robocopy command:

robocopy . "\\vsql\e$\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA" /z /e

After the move, I wanted to position the transaction log files on a different “drive” than the database files… so I selected all of the _log files and moved them to a separate VHD which sits on a different physical RAID array than the OS and Data VHDs.  This is for performance reasons related to BigFix.



Nearly an hour and a half later…


Attaching Databases

[screenshots: Attach Databases dialogs]

I encountered SQL error 5123 while selecting my .mdf file.  This is because the DATA directory is protected.  I solved it by simply browsing to that directory in Explorer and, when asked for “admin” access to it, clicking Yes.  Then choosing OK in the above dialog went as expected…


Since my Data and Log files reside on two different VHD drives, I needed to specify the path to each file separately.  The previous file selection automatically chose the correct Data file, and since my path matched between the Data VHD and the Logs VHD, I only needed to change the Drive letter.


Next, repeat that for each of the databases that are being moved over…



Transferring Logins

I opted not to perform this and simply recreate the couple of logins I needed.  Since only 3 services are currently using the SQL Server, recreating them was quick and extremely easy.

I also opted to change the DNS alias the three services were connecting to.  On my old server, both BigFix and SQL were running on the same box.  I decided to separate them into two separate virtuals.  vSQL will host my SQL server and allow all three services to connect to it.  vBigFix81 will serve as my BigFix server running version 8.1.  If I want to upgrade to 8.2 or beyond, I’ll simply upgrade vBigFix81 to get the upgrade documentation, then create a brand new vBigFix82 virtual, install 8.2, and change the BigFix DNS alias.

Now onto restoring each of my services…


Restoring Team Foundation Server

It made logical sense to restore this service first.  My vGPS service will need its aliases updated in code, which is stored on my TFS.  So…

Turns out this is extremely easy.  I only needed to update the web.config file under “C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services” with the new DNS alias for my SQL server.  I found information on this from http://msdn.microsoft.com/en-us/library/ms404869(v=vs.80).aspx, which pointed me to the Web.config for other reasons.  While there, I remembered that for many of my apps I have to configure the database connection within this file.  And done!

I was able to do that LIVE and the connection automatically repaired the db logins needed and established connections to the newly attached databases.


Restoring vGPS my Custom App

This was extremely easy: a simple matter of checking out my code from the TFS server, doing a Ctrl-H to replace the alias, then saving and publishing.  Testing showed my DB user account wasn’t configured right, so I made the appropriate adjustments and was all done.  The app works perfectly with the new SQL DNS alias.


Restoring BigFix

This one I take my time on because I really enjoy installing, customizing, and playing around with my company’s program.

I created a brand new virtual 2008r2 server, so I had to do the usual:  build the virtual, install the OS, patch the OS, install the prerequisites, and migrate BigFix v8.1.

 


Now that my three services have been restored and validated… the upgrade of my Microsoft SQL 2008r2 SP1 server is now complete!

Please leave comments and questions below!