# Tuesday, August 09, 2011

Thursday, August 18, 2011
Developing A Windows Phone Application Using Windows Azure

Subject:
You must register at https://www.clicktoattend.com/invitation.aspx?code=155740 in order to be admitted to the building and attend.
You are building a Windows Phone application, but what happens when millions of consumers start to use it? To plan ahead for the huge number of users your application might attract, a more robust architecture must be considered. Windows Azure is an exciting platform that empowers you, the developer, to off-load infrastructure needs to the cloud and helps you focus on building the application. In this session, you will learn to build a Windows Phone Notepad application that utilizes a WCF RESTful service layer with a SQL back end, built on the Windows Azure cloud.

Speaker:
Henry Lee, NewAgeSolution.Net
Henry Lee is the founder of NewAgeSolution.Net and is passionate about technology. He is also the author of the book Beginning Windows Phone 7 Development from Apress (http://bit.ly/wp7apress). He works with various Fortune 500 companies delivering mobile applications and rich internet applications. He recently formed a start-up company called ToeTapz.com, focusing his energy on delivering mobile applications to consumers. In his spare time, he helps the .NET community by delivering sessions at technology events. He enjoys talking with other technologists about current trends in technology and sharing business insights with fellow colleagues. Often you will find Henry at a local cigar bar, enjoying a cigar and a drink while trying to come up with the next big mobile application.

Date:
Thursday, August 18, 2011

Time:
Reception 6:00 PM, Program 6:15 PM

Location: 
Microsoft, 1290 Avenue of the Americas (the AXA building - bet. 51st/52nd Sts.), 6th floor

Directions:
B/D/F/V to 47th-50th Sts./Rockefeller Ctr
1 to 50th St./Bway
N/R/W to 49th St./7th Ave.

posted on Tuesday, August 09, 2011 4:43:47 AM (Eastern Daylight Time, UTC-04:00)
# Wednesday, August 11, 2010

Microsoft recently released a CTP of its cloud-based SQL Azure management tool, code-named “Houston”. Houston was announced last year at the PDC and is a web-based version of SQL Management Studio (written in Silverlight 4.0). If you are using SQL Management Studio, there really is no reason to use Houston; however, having the ability to do web-based management is great. You can manage your database from Starbucks without the need for SQL Management Studio. OK, that may not be a best practice, but hey, we’ve all done it. :)

You can get to Houston here. It will ask you for your credentials; log in using your standard SQL Azure credentials, but note that for “Login” you have to use the username@server format.


I logged in via Firefox and had no problem at all. I was presented with a cube control that allowed me to see a snapshot of the settings and usage statistics of my database. I browsed that for a minute and then went straight to the database objects. Houston gives you the ability to work with SQL Azure objects (tables, views, and stored procedures) and to create, drop, and modify them.


I played around with my tables’ DDL and all worked fine. I then decided to play around with the data. I was surprised that you can open a .SQL file off your local disk inside of Houston!


I opened up some complex queries that I wrote for Northwind on a local version of SQL Server 2008 R2 and tested them out. The scripts all worked fine; however, there was no code formatting that I could figure out (hey, that is OK).

I wanted to test whether Houston supported the ability to select a piece of TSQL and execute only that piece. I was sure it would not work, so I tested it with two SELECT statements, highlighted only one, and got back one result. (I tried rearranging the statements and highlighting only the second one, and it still worked!) Just to be sure, I put in a SELECT and a DELETE statement, highlighted only the SELECT statement, and only that piece of TSQL executed.


I then tried two SQL statements and got back two results, so the team clearly anticipated this scenario!


All in all I am quite happy with the CTP of Houston.  Take it for a spin yourself.

posted on Wednesday, August 11, 2010 6:16:33 AM (Eastern Daylight Time, UTC-04:00)
# Thursday, July 01, 2010

With the Q1 release of Telerik OpenAccess ORM, Telerik released a brand new LINQ implementation and a supporting Visual Entity Designer. I have shown on this blog how to connect to SQL Server and MySQL, and how to use the new LINQ implementation with RIA Services. Today I will show you how to connect to SQL Azure.

To get started, we have to create a new Telerik Domain Model in the server (ASP.NET) project. Right-click on the server project, select “Add”, and choose the Telerik Domain Model from the menu.

In the dialog presented by OpenAccess, select the database you want to connect to; for this project, choose Microsoft SQL Azure. You also have to manually enter the connection string to SQL Azure, in the format:

Server=tcp:yourSQLAzureDatabaseServer;Database=YourDatabaseName;User ID=YourUserID;Password=YourPassword;
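
Before pasting this into the wizard, it can be worth sanity-checking the string with plain ADO.NET. Here is a minimal sketch, assuming placeholder server, database, and credential values:

using System;
using System.Data.SqlClient;

class ConnectionTest
{
    static void Main()
    {
        //placeholder values: substitute your own server, database, and credentials
        string connStr = "Server=tcp:yourSQLAzureDatabaseServer;" +
                         "Database=YourDatabaseName;" +
                         "User ID=YourUserID;Password=YourPassword;";
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open(); //throws if the server name or credentials are wrong
            Console.WriteLine("Connected to: " + conn.DataSource);
        }
    }
}

If this opens a connection, the same string will work in the OpenAccess dialog.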


Next you have to map your tables to entities. The easiest thing to do is just map all of your tables by selecting the checkbox next to “Tables” in the Choose Database Items dialog and pressing the Finish button.


Visual Studio adds a new Telerik Domain Model to your project.


Now you are free to use the LINQ implementation to build your application. For simplicity, I will drag a GridView control onto the form and then use LINQ to bind all the customers in Germany. The code is here:

 

protected void Page_Load(object sender, EventArgs e)
{
    if (IsPostBack == false)
    {
        //data context
        NorthwindEntityDiagrams dat = new NorthwindEntityDiagrams();
        //LINQ statement
        var result = from c in dat.Customers
                     where c.Country == "Germany"
                     orderby c.CustomerID
                     select c;
        //Databind to the ASP.NET GridView
        GridView1.DataSource = result;
        GridView1.DataBind();
    }
}

The result is a GridView listing the customers from Germany.


Enjoy!

posted on Thursday, July 01, 2010 2:50:27 AM (Eastern Daylight Time, UTC-04:00)
# Friday, March 19, 2010

Ever since the “new” SQL Azure went to beta, I have craved an automated way to set up an OData (Astoria) service from my SQL Azure database. My perfect world would have been a checkbox next to each table in my database in the developer portal asking to “Restify” this table as a service. It always seemed kind of silly to have to build a web project, create an Entity Framework model of my SQL Azure database, build a WCF Data Services (OData) service on top of that, and then deploy it to a web host or Windows Azure. (Deploying such a simple service to Windows Azure seems like overkill.) In addition to all of that extra work, in theory it would not be the most efficient solution, since I am introducing a new server to the mix.

At Mix this week, and also on the OData team blog, there was an announcement of how to do this very easily. You can go to the SQL Azure Labs page, click on the “OData Service for SQL Azure” tab, enter your SQL Azure credentials, and assign your security settings; you will then be able to access your OData service via this method: https://odata.sqlazurelabs.com/OData.svc/v0.1/<serverName>/<databaseName>
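
Since the generated feed is plain Atom over HTTP, you can smoke-test it without any OData client stack at all. Here is a minimal sketch, with the server and database names as placeholders:

using System;
using System.Net;

class FeedTest
{
    static void Main()
    {
        //placeholder server/database names; Customers is one of the exposed tables
        string url = "https://odata.sqlazurelabs.com/OData.svc/v0.1/" +
                     "yourServerName/Northwind/Customers";
        using (WebClient client = new WebClient())
        {
            //dump the raw Atom feed to the console
            Console.WriteLine(client.DownloadString(url));
        }
    }
}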


I went in and gave it a try. In about 15 seconds I had a working OData feed: no need to build a new web site, build an EDM, build an OData service, and deploy; it was all made for me automatically. That saved me a lot of time and the hassle (and cost) of deploying a new web site somewhere. Also, since this is all Azure, I would argue that it is more efficient to run this from Microsoft’s servers than from mine: fewer hops to the SQL Azure database. (At least that is my theory.)


To really give this a test drive, I opened up Excel 2010 and used SQL Server PowerPivot. I chose to import from “Data Feeds” and entered the address for my service. I then imported the Customers, Orders, and Order Details tables and built a simple PivotTable.


This is a great new feature!


If you are doing any work with Data Services and SQL Azure today, you need to investigate this new feature. Enjoy!


posted on Friday, March 19, 2010 4:40:11 AM (Eastern Daylight Time, UTC-04:00)
# Tuesday, March 09, 2010

By now there have been a lot of blog posts on Windows Azure billing. I have stayed out of it since I figured that the billing scheme would generate some sticker shock on our end and some rethinking on Microsoft's end. For the most part it has, but I now want to tell my story, since I think most early Azure users are thinking along the same lines.

When Windows and SQL Azure went live, I wanted to deploy an application using some of Telerik’s products to “production”. I put my free MSDN hours into the Azure system for billing and uploaded the application. I actually could not get it to work and left it up there, figuring I would get back to it and fix it later. Periodically I would go in and poke around with it, and eventually fixed it. For the most part I had nothing more than an advanced “Hello World” and simple Northwind data-over-forms via SQL Azure up there.

Recently, I received a bill for $7 since I went over my free 750 hours by about 65 hours. (I guess I had a test and a production account running at the same time for a while.) Even though for the most part I had no hits other than myself a few times, I still incurred charges since I left my service “live” in production. My bad; I learned a lesson about how Azure works, and luckily it was only a $7 lesson.

It was then that I realized that I was guilty of treating Windows Azure as a fancy web hosting account. The problem is that Windows Azure is not web hosting, but rather a “web operating system”, a cloud service hosting and service management environment. We’re not only paying for hosting; we are paying for Azure to manage our application for us, much like enterprise middleware 10-15 years ago, but for the cloud. I now look at Azure differently, and this is good since I will use it differently (and how it was intended). I am guessing that other developers with $7 bills in their inboxes this month will do the same.

That said, I was in Redmond a month or two ago and had a chance to talk to the head of MSDN. I complained about how the MSDN subscription offer was only for 8 months, etc. He told me that for the first time in Microsoft’s history, this service involves hard physical assets that have to be paid for. It is not like giving me a free copy of Windows, which costs Microsoft nothing except the bandwidth for me to download it (a fixed cost). I get that, and I am sure that there will be a cost-effective MSDN-Azure “developer only” subscription option in the future. Or at least there should be. :)


posted on Tuesday, March 09, 2010 5:23:54 AM (Eastern Standard Time, UTC-05:00)
# Friday, January 15, 2010

Have a startup? Want free software? The Microsoft BizSpark Camp is for you. You have to sign up by Monday. See below for more details.

Via Sanjay Jain

BizSpark Camp

With several successful Microsoft BizSpark Incubation Weeks behind us (Azure Atlanta, Win7 Boston, Win7 Reston, CRM Reston, CRM Boston, Win7 Irvine, Mobility Mountain View), we are pleased to announce the Microsoft BizSpark Camp for Windows Azure in New York, NY during 28–29 January 2010. Based upon your feedback we have made several changes, including offering a cash prize, a compressed time commitment, and much more. We hope you're tapping into the growing BizSpark community.

The current economic downturn is putting many entrepreneurs under increasing pressure, making it critical to find new resources and ways to reduce costs and inefficiencies. The Microsoft BizSpark Camp for Windows Azure is designed to offer the following assistance to entrepreneurs:

· A chance to win a cash prize of $5,000

· Nomination for BizSpark One (an invitation-only program) for high-potential startups

· Learn and build new applications in the cloud, or use interoperable services that run on Microsoft infrastructure to extend and enhance your existing applications, with the help of on-site advisors

· Get entrepreneurial coaching from a panel of industry experts

· Generate marketing buzz for your brand

· Create an opportunity to be highlighted at an upcoming launch

We are inviting nominations from BizSpark startups interested in the Windows Azure Platform that target one or more of the following:

The Microsoft BizSpark Camp for Windows Azure will be held at the Microsoft Technology Center, New York, NY from Thu 1/28/2010 to Fri 1/29/2010. This event consists of ½ day of training, 1 day of active prototype/development time, and ½ day for packaging/finishing and reporting out to a panel of judges for various prizes.

This event is a no-fee event (plan your own travel expenses), and each team can bring 3 participants (1 business person and 1–2 developers). It is required to have at least 1 developer as part of your team.

To participate in the BizSpark Camp, you must submit your team for nomination to Sanjay or your BizSpark sponsor. Visit Sanjay’s blog for details on how to submit your nomination by Monday, January 18th, 2010. Nominations will be judged according to the strength of the founding team, the originality and creativity of the idea, and the ability to leverage Windows Azure scenarios.

You may want to enroll in Microsoft BizSpark, an exciting new offering that enables software startups to leverage Microsoft development and platform technologies to deliver next-generation web and Software + Services applications. For details see here.

posted on Friday, January 15, 2010 4:41:18 AM (Eastern Standard Time, UTC-05:00)
# Sunday, January 10, 2010

Last week I started to make some predictions on the future of Microsoft development for 2010. I said 2010 will be remembered as a “tipping point” year for three things in the Microsoft developer space. I am not saying that on December 31st, 2010 you will sit back and say I am right on all of these, but I am saying that by December 31st, 2011 or 2012 you will. That said, 2010 will be the tipping point for:

  • Moving beyond .NET
  • BI for the masses
  • The cloud

Last week I talked about moving beyond .NET and BI for the masses; today I will talk about the cloud.

2010: Windows and SQL Azure Launch

January 1 ushered in a new year as well as the availability of a new service from Microsoft: Azure. With the Windows and SQL Azure services commercially available, one could predict that 2010 will be the year of the cloud. While I don’t think that developers are going to push .NET applications up into the cloud en masse in 2010, I do suspect that adoption will be higher than most people realize, but the ultimate customers who adopt the service may surprise you.

In 2010 the early adopters of Windows and SQL Azure will be large corporations’ “departmental” applications. Sure, we will see a bunch of startups begin to use Azure; however, there will only be significant traction after the platform has been out a little longer, and maybe after a potential billing policy change. Applications built under the radar of the corporate IT department by external consultants and departmental programmers will lead the way, just as Access and Visual Basic did almost 20 years ago.

The reason why is obvious. Business men and women at large companies (companies over 200 people) always groan when they have to deal with their IT department. They think that IT is slow and costly. Azure will be an end run around IT. The cost of Azure is well within the budgets of the folks requesting these applications, actually quite a bit lower than what the internal IT department would charge back to the department to host an application. Windows and SQL Azure will be secure enough to host these non-mission-critical, yet extremely important, line-of-business applications. Since almost all of these applications are used at the office and broadband is available at every office, it is a no-brainer.

Rewind 20 years and this is how Microsoft conquered the enterprise. (I am not saying that Microsoft “owns” the enterprise, but 20 years ago they had no presence in the back office and were counted out in the server/enterprise space. Now look: SQL Server, SharePoint, Exchange Server, etc., not to mention Windows Server.) Microsoft used the desktop to extend into the back office. Windows and Office were so popular that developers were willing to give the Trojan horses Access and Visual Basic a try. The rest is history.

So 2010 will be a tipping-point year for the cloud, or at least the Azure platform. Microsoft will gain some market share, make some mistakes, and continue to rev Azure. I can’t predict at this moment when it will reach critical mass, but just like with Silverlight and BI, we will all look back one day and say it all started in 2010.

posted on Sunday, January 10, 2010 3:11:33 AM (Eastern Standard Time, UTC-05:00)
# Thursday, January 07, 2010

As you may know, Telerik has built support for Windows and SQL Azure into most of its products. While at the PDC last November, Ben Riga interviewed me on what it was like to build a commercial product on top of Azure. I give a sneak peek at how we developed the software and how we leveraged SQL Azure. I talk about some of our pain points as well as where it was easy. The video is here, complete with my cell phone’s battery dying in the middle of our interview!


Enjoy!

posted on Thursday, January 07, 2010 3:31:11 AM (Eastern Standard Time, UTC-05:00)
# Monday, December 21, 2009

At the PDC, Microsoft released the Sync Framework Power Pack for SQL Azure tool. It is built on top of the Sync Framework 2.0 and provides a wizard to keep an on-premises database and a SQL Azure database in sync, as well as a Visual Studio template that lets you take a SQL Azure database offline using SQL Compact (or whatever they call it these days). Let’s take a look at the data sync wizard today and look at the Visual Studio template soon.

I started the wizard via Start|Microsoft Sync Framework|SQL Azure Data Sync Tool for SQL Server. The wizard works by taking an on-premises database, creating a new one in SQL Azure, and creating the SQL Agent jobs to keep them in sync. I could not find out how to sync two identical databases that already exist. To get started, you need to provide your SQL Azure credentials and the name of the new database that you are going to create in SQL Azure.


After you provide the credentials for your local server, you have to select the database to sync with. The next screen allows you to select the tables you want to keep in sync. You also get to choose who wins in a conflict: the local server or SQL Azure.


Next you get to choose the order of your sync. This is a very smart feature: you get to specify the sync order to avoid the classic problem of an order detail arriving before its order ID exists.


After you finish the wizard, it will create the new database in SQL Azure for you, along with the tables and associated tracking tables. It will also create the SQL Server Agent jobs required to keep the data in sync. I noticed that SQL Agent was not running on my local server, so no sync was performed, and after the wizard ran I had no data. I had to go into SQL Server Agent, enable it, and change the default schedule of my job (which was daily at midnight).


You may want to get more creative with the SQL Server Agent schedule depending on your environment, however, the Data Sync wizard is a handy tool to get you started.

Enjoy!

posted on Monday, December 21, 2009 5:59:41 AM (Eastern Standard Time, UTC-05:00)
# Saturday, December 19, 2009

While I love Northwind, and would do anything for Northwind, I realize that there really is not enough data in Northwind for meaningful demos. We’ve been using Northwind for 15+ years, and the reason why we have not completely abandoned it is that AdventureWorks is such crap. Someone at Microsoft took a relational pill one day and over-relationalized AdventureWorks to 100th normal form. It takes about 7 joins to get a list of unique customers and their mailing addresses. Microsoft has admitted the mistake and given us “AdventureWorks Lite”, a slimmed-down version of AdventureWorks with fewer joins and less pain in the you-know-what. (Only 3 joins to get the customers and their mailing addresses.) In addition to AdventureWorks Lite, Microsoft has also given us a data warehouse version of AdventureWorks in a star schema for BI testing, called AdventureWorksDW.

You can now easily load these databases into SQL Azure. The databases were released on CodePlex along with a loading script that you have to run from the command line. Be careful: the data warehouse database is larger and can cause a timeout on a slow connection, forcing you to start over.


Enjoy!

posted on Saturday, December 19, 2009 9:45:12 PM (Eastern Standard Time, UTC-05:00)
# Wednesday, December 16, 2009

Far and away the number one Business Intelligence client in the world is Microsoft Excel. While there are tons of data visualization tools out there, Excel is hands down the leader since it is both familiar to users and very powerful. Developers have built tons of business intelligence (BI) apps in Excel using connections to data warehouses (or cubes) and letting the users go crazy with pivot tables and charts.

Things are about to get even better with project Gemini, now Microsoft SQL Server PowerPivot (we rebel and just call it PowerPivot). PowerPivot is an add-in for Excel 2010 or SharePoint. PowerPivot for Excel is a data analysis tool that allows you to connect to a database, download data, and store it in a local data engine (VertiPaq) so you can slice and dice it to your heart’s content using familiar Excel tools such as pivot tables and charts. (You can then send it up to SharePoint if you like, but let’s just focus on Excel for now.) PowerPivot works in-memory using your local PC’s processing power and is optimized to handle millions of rows in memory on cheap commodity PCs. The VertiPaq OLAP engine that is part of PowerPivot compresses and manages millions of rows of data in memory for you.

There are many awesome features in PowerPivot, but something I learned reading the team’s blog post on data importing was that PowerPivot supports SQL Azure natively. This scenario is great since you can download your SQL Azure data, store it locally, and slice and dice it offline.

Let’s imagine you have this set up in SQL Azure:

  • SQL Azure cloud based online-transaction processing database (OLTP)
  • SQL Azure cloud based online analytical processing data warehouse database  (OLAP)

You don’t have a local server to set up Analysis Services cubes, and SQL Azure doesn’t provide that capability. You decide on a hybrid solution and provide a “self-service” distributed OLAP system. Your users, using Excel and PowerPivot, download some of the OLAP data from SQL Azure and use their own hardware to do the number crunching. You may be thinking, “I heard that this VertiPaq engine is great, but how will a laptop handle all of this data/processing?” Remember, if you architect your OLAP database properly, a lot of “pre-crunching” will already be done, avoiding many segmentations and rollups. I tested about 100 million rows (of old Corzen data) on the “PDC laptop” with 2 GB of RAM and had near-instantaneous results. While the cloud is real sexy right now, you can’t perform these types of operations without major latency.

Let’s walk through the basics: just getting some data from SQL Azure into Excel and PowerPivot. First you need Excel 2010 and PowerPivot. You can grab Excel for free via Microsoft’s web site (for now!) since it is in beta, and PowerPivot from here. Make sure you download the proper version: x32 or x64.

Once installed, you will see a “PowerPivot” tab in Excel. You can click on the PowerPivot window icon to get started.


The PowerPivot tool is mostly for managing data and connections. You can import data from a variety of data sources, including databases, files, and services (including RESTful ones). To connect and download from SQL Azure, you have to choose From Database|From Other Sources from the Home tab on the ribbon.


This will bring you to a list of available data sources, choose Microsoft SQL Azure.


Of course, you will need to log into SQL Azure.


Once logged in, you can download data either from a TSQL query or just the tables and views raw. I chose to download just the tables from my Northwind database.


There are some cool features, like the ability to select one table and then automatically select all related tables. You can also apply filters to your imports (good if you are segmenting the data by user). Once you are finished, PowerPivot will import all of your data for you.


Now that your data is in PowerPivot, you can play with it very easily. Remember that the data is located on the client, in-memory, as part of your workbook. (It is compressed.) PowerPivot gives you a tab for each data table you imported, and you can go in and, in typical Excel fashion, sort and filter, like I did for Barcelona (oh, the fond memories of 5 TechEds in Barcelona…). All of this sorting and filtering happens in-memory on your machine; you can unplug your network cable if you like. However, you can also refresh your data as needed. PowerPivot gives the developer several ways to refresh data. Some techniques and guidelines are here.


Next you can create some PivotTables, charts, or a combination. Just choose PivotTable from the Home menu and then choose a PivotTable type. I will just make a dirt-simple PivotTable to give you an idea of some of the interactivity you can build pretty rapidly. (I am also using the OLTP version of Northwind for this walk-through, so I did not create any cool queries to work with yet. I’ll do another blog post with some more sophisticated demos after the holidays.)


Here is a very simple, yet super powerful, PivotTable that I built in 30 seconds using the PowerPivot tools. Of course I just used the raw Order table, and my lookups (ShipVia and Employee) are showing their integer values, but work with me. I have a PivotTable where I can view the total amount each customer spent on freight across all of our orders, by country/city, broken out by the employee who took the order. I can also dynamically filter the data set by Country and further filter it by the ShipVia data set.

This creates a very interactive data analysis “report” for your users. (Or they can create it themselves if they are Excel savvy.) In a few seconds the users can see multiple versions of the data, changing the filters (and sorts) on the fly. All shippers in all cities for all employees. No problem. Shipper ID 2 in USA only. Sure. You get the drift. (What is cool is that the users can share this via PowerPivot for SharePoint if they like!)


Here is what I did:

I dragged ShipCity and CustomerID to the “Row Labels” box to use them as the main data elements in our rows. I dragged Freight into the Values box to sum the freight for each CustomerID. (Customers have many orders with different freight costs, so freight is a good item to sum up.) I chose Country as a dynamic filter and broke out the freight totals by EmployeeID (in the Column Labels box). Lastly, I added a vertical slicer where I see a slice of all shippers (ShipVia) and can apply additional filters on top of all current data selections.

Pretty easy. Now combine this with some pre-built views on the server and your users can really go to town. All without a persistent connection back up to SQL Azure.

Enjoy!

posted on Wednesday, December 16, 2009 5:14:02 AM (Eastern Standard Time, UTC-05:00)
# Monday, December 14, 2009

Thursday, December 17, 2009
Application Development for the Windows Azure Platform

Subject: 
You must register at https://www.clicktoattend.com/invitation.aspx?code=143228 in order to be admitted to the building and attend.
At PDC 09, Microsoft announced the commercial availability of the Windows Azure Platform. This presentation will cover the key features of the Windows Azure distributed, fabric-based operating system that make it possible to build applications leveraging distributed computation and storage capabilities in the cloud. It will also cover higher-layer .NET Services such as SQL Azure, Microsoft’s distributed SQL Server database in the cloud, and the .NET Services that provide extended connectivity and security across cloud and on-premises applications.
The focus will be on developing applications using the various platform APIs for development, deployment and management of Windows Azure Platform services.

Speaker: 
Bill Zack, Microsoft
Bill Zack is an Architect Evangelist with Microsoft. He comes to this role after serving as a Solutions Architect in the Financial Services Group of Microsoft Consulting Services. His experience includes developing, supporting, and evangelizing .NET/SOA-based frameworks used to jump-start development projects for financial services companies. Prior to joining Microsoft he acted as a consultant, architect, administrator, developer, and system integrator. He has also authored several computer books and white papers. He is the Founder and President of the New York chapter of the International Association of Software Architects (IASA) and a member of the IASA Board of Directors. He is also Co-Moderator of the New York City .NET Developers Group, founder and past president of the New York Enterprise Windows User Group, and the founder and past president of several other computer user groups.

Date: 
Thursday, December 17, 2009

Time: 
Reception 6:00 PM, Program 6:15 PM

Location:  
Microsoft, 1290 Avenue of the Americas (the AXA building - bet. 51st/52nd Sts.), 6th floor

Directions:
B/D/F/V to 47th-50th Sts./Rockefeller Ctr
1 to 50th St./Bway
N/R/W to 49th St./7th Ave.

posted on Monday, December 14, 2009 3:38:48 AM (Eastern Standard Time, UTC-05:00)
# Wednesday, December 02, 2009

Yesterday I showed how to use Telerik OpenAccess, Telerik Reporting, and SQL Azure to create reports and view them in ASP.NET. Today I will show how to view that same report in Silverlight using the industry’s first native Silverlight report viewer. All of this is in the Telerik documentation; however, since we have our cool SQL Azure demo already up and running from yesterday, I figured it would be fun to show how to reuse the same report and view it in Silverlight.

Getting Started

Remember from yesterday that we have three projects in our solution:

  • Telerik.Reporting.DAL: the class library project containing our OpenAccess entities (mapped back to SQL Azure tables)
  • Telerik.Reporting.RptLib: the class library project containing our Telerik report, using the DAL project as a data source
  • Telerik.Reporting.Web: the ASP.NET application containing our Report Viewer control, allowing users to view the report

Now we will add a Silverlight project named Telerik.Reporting.SL that will use the Telerik.Reporting.Web application as its ASP.NET host.


The Telerik Reporting WCF Service

Since all data access in Silverlight has to go through an asynchronous service, we have to send our report down to our Silverlight application via a WCF service. At the end of the day a Telerik Report is just a plain old .NET class, so we can easily send it down via WCF. The good news is that a lot of the WCF plumbing has been taken care of for you by the Telerik Reporting service. Let’s get started.

First we have to set a reference in our ASP.NET host application (Telerik.Reporting.Web) to Telerik.Reporting.Service. Next add a WCF service to your project, named ReportService.svc. This will also automatically add a reference to System.ServiceModel for you.

Now delete everything in the SVC file and replace it with this:

<%@ServiceHost Service="Telerik.Reporting.Service.ReportService, Telerik.Reporting.Service, 
Version=3.2.9.1113, Culture=neutral, PublicKeyToken=A9D7983DFCC261BE" %>

Note: you have to make sure that the version number you are using is the same as mine. You can get your version number by clicking on the Telerik.Reporting.Service reference in the solution explorer and looking at the version property.


Next you will have to add the WCF endpoint to the configuration section of the web.config of the same ASP.NET project that has the ReportService.svc WCF service. (You can copy this code to paste as is into your application from the Report Service support page.)

 
 <system.serviceModel>
   <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
   <services>
     <service name="Telerik.Reporting.Service.ReportService" 
        behaviorConfiguration="ReportServiceBehavior">
      <endpoint address="" binding="basicHttpBinding"
        contract="Telerik.Reporting.Service.IReportService">
         <identity>
           <dns value="localhost" />
         </identity>
       </endpoint>
       <endpoint address="resources" 
          binding="webHttpBinding"
          behaviorConfiguration="WebBehavior" 
          contract="Telerik.Reporting.Service.IResourceService"/>
       <endpoint address="mex" binding="mexHttpBinding" 
          contract="IMetadataExchange" />
     </service>
   </services>
   <behaviors>
     <serviceBehaviors>
       <behavior name="ReportServiceBehavior">
         <serviceMetadata httpGetEnabled="true" />
         <serviceDebug includeExceptionDetailInFaults="false" />
       </behavior>
     </serviceBehaviors>
     <endpointBehaviors>
       <behavior name="WebBehavior">
         <webHttp />
       </behavior>
     </endpointBehaviors>
   </behaviors>
 </system.serviceModel>

 

That is it for the Telerik WCF Reporting Service. Now let’s add a ReportViewer to our Silverlight application.

Using the Telerik Silverlight Report Viewer Control

The Telerik Silverlight Report Viewer offers native report-viewing support for Silverlight. It uses the same rendering engine and offers the same functionality as the other Telerik viewers, so your report will render the same in Silverlight as in Windows Forms and ASP.NET. To get started, you need to add references to the Telerik Silverlight controls in your Telerik.Reporting.SL project:

  • Telerik.Windows.Controls.dll
  • Telerik.Windows.Controls.Input.dll
  • Telerik.Windows.Controls.Navigation.dll

These controls are located in the following folder:

Program Files\Telerik\RadControls for Silverlight Q3 2009\Binaries\Silverlight\

In addition, you need to add a reference to the ReportViewer:

  • Telerik.ReportViewer.Silverlight.dll

This library is located in Program Files\Telerik\Reporting Q3 2009\Bin\

Lastly, add a reference to our report project: Telerik.Reporting.RptLib.

Now you can add the report viewer namespace and control to your XAML:

<UserControl
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d"
    xmlns:Telerik_ReportViewer_Silverlight="clr-namespace:Telerik.ReportViewer.Silverlight;assembly=Telerik.ReportViewer.Silverlight"
    x:Class="Telerik.Reporting.SL.MainPage"
    d:DesignWidth="640"
    d:DesignHeight="480">
  <Grid x:Name="LayoutRoot">
      <Telerik_ReportViewer_Silverlight:ReportViewer
        Margin="0,0,20,8"
        ReportServerUri="../ReportService.svc"
        Report="Telerik.Reporting.RptLib.CustomerRpt, Telerik.Reporting.RptLib, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null"/>
  </Grid>
</UserControl>

 

Note: The Telerik support page for the Silverlight Report Viewer is a good reference for the code needed, and you can copy and paste from there instead of from here. Alternatively, you can use Expression Blend to add the report to your XAML form, so you don’t have to worry about the namespaces and XAML for the ReportViewer; Blend will take care of that for you, and all you have to worry about are the ReportServiceUri and Report properties. If you are not using Blend, just make sure that the properties are in the correct order in the XAML (ReportServiceUri first, Report second) or you will get some nutty bugs. (Ask me how I know this!)

You are going to have to set two properties of the ReportViewer control (either manually in XAML or in Blend):


  • ReportServiceUri: the absolute or relative path back to the WCF service in the ASP.NET solution
  • Report: the assembly-qualified name of your report class

<Sidebar>

How do you find the assembly-qualified name of your report? I was confused at first too. There are some PowerShell tools, etc., but I am PowerShell-challenged. Instead, just add a label to an ASP.NET WebForm in the Telerik.Reporting.Web project and add this code to the page load event:

Type ty = typeof(Telerik.Reporting.RptLib.CustomerRpt);
Label1.Text = ty.AssemblyQualifiedName;

Then run the page in the browser and copy and paste the fully qualified assembly name into your XAML as I did. :)
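
If you would rather not add a throwaway WebForm, the same trick works from any console app that references the report project, with something like:

//prints the assembly qualified name of the report class
Console.WriteLine(typeof(Telerik.Reporting.RptLib.CustomerRpt).AssemblyQualifiedName);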

</Sidebar>

That is it. The next step is to set your Silverlight test page as the startup page and hit F5. The ReportViewer supports native printing and export to PDF, XLS (Excel), etc. Pretty cool.


Enjoy!

posted on Wednesday, December 02, 2009 7:20:54 AM (Eastern Standard Time, UTC-05:00)
# Friday, November 13, 2009

Earlier this week, Microsoft released the SQL Server 2008 R2 November CTP. (Microsoft needs to hire some people from Apple to do something about the naming of their products.) I learned while at TechEd Europe 2009 that the version of SQL Server Management Studio (SSMS) that ships with R2 CTP3 has support for SQL Azure. So I downloaded the CTP (I had to do it in the speaker room, since the speaker hotel blocks just about everything on its “free” wifi) and got rolling.

After installing, I decided to play around with the SQL Azure support. I was able to put my login credentials right into the SQL Server connection dialog and did not need to specify a database in the “Options” menu like I had to do before.


I was able to play around with SSMS and Azure and create databases, tables, views, etc. (via TSQL, not dialogs). Most importantly, SSMS does support the Object Explorer for SQL Azure! We can’t change the database as we can in other community tools; however, this is still a beta, and it has come a long way so far. SSMS supports most of the functionality that you will need to get started with SQL Azure.


Enjoy!

posted on Friday, November 13, 2009 5:02:34 AM (Eastern Standard Time, UTC-05:00)
# Thursday, November 12, 2009

Just a few weeks ago Microsoft released Visual Studio 2010 Beta2. Last week Telerik put its Q3 release live into production. One of the cool new Q3 features is that OpenAccess now works seamlessly with Visual Studio 2010. That means you can target .NET 3.5 or .NET 4.0 using either Visual Studio 2008 (.NET 3.5) or Visual Studio 2010 (.NET 4.0).

I will do a quick demo with Visual Studio 2010, SQL Azure, and OpenAccess. With OpenAccess Q3 installed, I fired up Visual Studio 2010 and started a console project targeting .NET 4.0.


While the project will target the .NET 4.0 Framework, we have to do one small thing to make it work. By default the project type is “.NET 4.0 Client Profile”, so we have to change that to a straight-up .NET 4.0 project type. To do this, right-click on the project and select Properties. In the Application section of the properties dialog, you will see Target framework; select .NET 4.0 and you are good to go. (Visual Studio will have to close and reopen the project for you.)


Next we have to fire up OpenAccess via the Enable Project Wizard. When you start the Enable Project to use ORM wizard, OpenAccess asks what database to use, and as I showed before on this blog, Q3 now supports SQL Azure natively. Notice that the wizard will prompt you to put in your SQL Azure credentials and will give you the basic template for your server name: tcp:<sqlazureid>.database.windows.net.


Note: Depending on your setup in Visual Studio 2010, you may have to use the server name without the tcp: prefix and use the syntax UserName@sqlazureid. Visual Studio 2010 will give you an error in your setup if the default does not work. If you get this error, you would enter the following for your SQL Azure credentials:

Server Name: sqlazureid.database.windows.net (no tcp:, so for example p28drog84.database.windows.net)
User Name: YourSQLAzureUserID@sqlazureid (for example: Stevef@p28drog84)

Next you will want to map some SQL Azure tables to OpenAccess entities. This can be done pretty easily, just by running the Reverse Mapping wizard, where you can select the tables to map. By default OpenAccess will now also map the foreign key of each entity as a primitive type in addition to the complex type. This helps a great deal if you are using your entities in conjunction with any data service such as WCF or ADO.NET Data Services. (More on that later.)
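
To picture what that means, here is a hypothetical sketch of what a reverse-mapped entity might look like (the property names are assumed, not the actual generated code):

public partial class Order
{
    public int OrderID { get; set; }
    //the foreign key surfaced as a primitive, handy for data services...
    public string CustomerID { get; set; }
    //...in addition to the usual complex reference
    public Customer Customer { get; set; }
}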


Once you have mapped your entities, you are free to work with them. You can use the OpenAccess LINQ implementation (which has gone through a major overhaul and is in line with the LINQ to SQL and LINQ to Entities implementations). As I showed last time, you can write a simple LINQ statement to filter all the customers by a certain country, as shown here:

static void Main(string[] args)
{
    IObjectScope dat = ObjectScopeProvider1.GetNewObjectScope();
    //LINQ statement
    var result = from c in dat.Extent<Customer>()
                 where c.Country == "Germany"
                 orderby c.CompanyName
                 select c;
    //Print out the company name
    foreach (var cust in result)
    {
        Console.WriteLine("Company Name: " + cust.CompanyName);
    }
    //keep the console window open
    Console.Read();
}

 


Enjoy!

posted on Thursday, November 12, 2009 11:20:25 AM (Eastern Standard Time, UTC-05:00)
# Monday, November 02, 2009

Microsoft’s SQL Azure database offering has great third-party support. This week Telerik is releasing the Q3 version of its entire product line, and the OpenAccess ORM will have more robust, native support for SQL Azure than what is currently available. I will expand on the example I did last week on connecting to SQL Azure by showing how to work with OpenAccess entities via WCF in a Silverlight application.

The first thing you have to do is create a project for your data access layer and connect to SQL Azure. I started a new class library and used the Enable Project to use ORM wizard. This is where you can specify your SQL Azure account and credentials. I showed how to do this last week, so I will skip the steps here. (This is a new feature of Q3; in the previous version of OpenAccess you had to use the SQL Server provider, but now OpenAccess supports SQL Azure natively!)

Next we have to create a project to contain our WCF service. We then point the Telerik Data Service Wizard to the DAL project and have it automatically create the SVC and CS files of our service for us. The wizard will automatically create all of the CRUD methods for our entities. (In this demo I only used Customers.) In case you have not used the wizard yet, here is a walk-through video on how to do that.

Telerik OpenAccess WCF Wizard Part I from Stephen Forte on Vimeo.

Now we have two projects: one for our DAL and one for our WCF service. Next, add a Silverlight project; your solution should then contain four projects: the DAL project, the WCF service project, the Silverlight Web project, and the Silverlight project itself.


Next we set a service reference to our WCF service and call the ReadCustomers method to get a list of all the customers and bind it to a XAML grid. (Remember, this being Silverlight, we have to do it all asynchronously.) We do this inside a LoadData method in our form.

   1:  private void LoadData()
   2:  {
   3:      //ref to our service proxy
   4:      SampleWCFServiceClient wcf = new SampleWCFServiceClient();
   5:      //register the event handler-can move this up if you want
   6:      wcf.ReadCustomersCompleted += ReadCustomersCompleted;
   7:      //make an async call to ReadCustomer method of our WCF service
   8:      //get only the first 100 records (default)
   9:      wcf.ReadCustomersAsync(0, 100);
  10:  }

The first thing that we do in the code above is create a reference to our WCF service in line 4 and then register an event handler to catch the asynchronous completion of the event on line 6. On line 9 we make the (asynchronous) call to ReadCustomers(). Since ReadCustomers() will process asynchronously, we will have to go to the ReadCustomersCompleted() method to catch the event. Let’s look at that here:

void ReadCustomersCompleted(object sender, ReadCustomersCompletedEventArgs e)
{
    //if the filter is set use a LINQ statement
    //this can also be done on the server via the service
    if (CheckFilter.IsChecked == true)
    {
        var filter = from c in e.Result
                     where c.Country == "Germany"
                     select c;

        dataGridCustomers.ItemsSource = filter;
    }
    else
    {
        dataGridCustomers.ItemsSource = e.Result;
    }
}

In ReadCustomersCompleted we check whether a checkbox is checked and, if so, run a client-side LINQ statement to filter for only the customers in Germany. (This is a holdover from a demo I did at BASTA in Germany; of course you should move your countries to a drop-down list and then filter with a parameter. Better yet, filter via the WCF service on the server, as sketched below!) If the checkbox is not checked, we just show all of the customers.
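
For completeness, a server-side filter could look something like this hypothetical service method (the name and signature are assumed), reusing the IObjectScope pattern from the DAL:

public IList<Customer> ReadCustomersByCountry(string country)
{
    //hypothetical method: filters on the server so only matching rows cross the wire
    IObjectScope scope = ObjectScopeProvider1.GetNewObjectScope();
    return (from c in scope.Extent<Customer>()
            where c.Country == country
            orderby c.CompanyName
            select c).ToList();
}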


If you want to edit a customer (or add one, etc.), the Silverlight grid allows you to do this inside the grid itself. However, you have to make sure that all of your dirty records are tracked, so you send back only the dirty records to your back-end WCF service. (Why bother updating all of the records?)

Here is the code to build the collection on the Begin Edit of the grid. This code just adds the current customer object into our custom collection (editedCustomers) so we can loop through it later on if we are doing an update.

void dataGridCustomers_BeginningEdit(object sender, DataGridBeginningEditEventArgs e)
{
    //build a list of Customers that are dirty
    Customer customer = e.Row.DataContext as NorthwindWCFService.Customer;

    if (!editedCustomers.Contains(customer))
    {
        editedCustomers.Add(customer);
    }
}

 

Now that we have our collection of dirty customers, we have to deal with the save button. The code below runs when the user clicks the save button, saving all dirty records. Line 6 sets up the WCF service via the proxy and line 8 registers the event. Lines 11-15 loop over all of the dirty customers. (I get them via the custom collection editedCustomers shown above.) Inside the loop, on line 14, we make the actual asynchronous call to the WCF service’s UpdateCustomer method, passing in the object and its ID. The UpdateCustomerCompleted event will fire (since this method is called asynchronously) when the update is complete; we have nothing in that method except some cleanup of our custom collection and a message box telling the users that the update is complete.

   1:  void ButtonSave_Click(object sender, RoutedEventArgs e)
   2:  {
   3:      
   4:      //the WCF service
   5:      //ref to our service proxy
   6:      SampleWCFServiceClient client = new SampleWCFServiceClient();
   7:      //register the event handler-can move this up if you want
   8:      client.UpdateCustomerCompleted += UpdateCustomerCompleted;
   9:   
  10:      //save only the dirty customers
  11:      foreach (NorthwindWCFService.Customer customer in editedCustomers)
  12:      {
  13:          //call the WCF method async to update the customer
  14:          client.UpdateCustomerAsync(customer.CustomerID.ToString(), customer);
  15:      }
  16:   
  17:  }

That is all there is to it! An add or a delete is done in the same way, as sketched below.
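
For example, a delete could look something like this sketch, assuming the wizard generated a DeleteCustomerAsync method following the same pattern as UpdateCustomerAsync:

void ButtonDelete_Click(object sender, RoutedEventArgs e)
{
    //ref to our service proxy
    SampleWCFServiceClient client = new SampleWCFServiceClient();
    //hypothetical completion handler, mirroring UpdateCustomerCompleted above
    client.DeleteCustomerCompleted += (s, args) => MessageBox.Show("Customer deleted.");
    //selectedCustomer is a hypothetical field holding the grid's selected row
    client.DeleteCustomerAsync(selectedCustomer.CustomerID.ToString());
}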

Enjoy!

posted on Monday, November 02, 2009 5:13:59 AM (Eastern Standard Time, UTC-05:00)
# Friday, October 30, 2009

With the official release of SQL Azure less than three weeks away, we are starting to see mainstream vendor support for SQL Azure. Telerik’s OpenAccess ORM is no exception. With the Q3 release of OpenAccess next week, OpenAccess will have full support for SQL Azure, going further than the basic support available today that I demonstrated on my blog last month. Full wizard support, forward and reverse mapping, and of course data services support via the Telerik Data Services Wizard. Let’s take a look at the basics here.

Getting Started

To get up and running and show the mapping and LINQ support, I will open Visual Studio 2008 (or 2010) and create a simple Console Application named OpenAccess.Azure.Demo. The first step is to enable the project to use OpenAccess via the Enable Project Wizard. As part of the wizard you need to specify which database you are going to connect to. OpenAccess gives you many choices besides Microsoft SQL Server, and one of the native choices is Microsoft SQL Azure. After you select SQL Azure, you will need to provide your credentials (don’t forget to put the tcp: in front of your server name.)


After you connect and finish the wizard, it is time to do some mapping.

Mapping Database Objects

To map some database objects to persistent classes, choose the Reverse Mapping wizard. This will bring up the Reverse Mapping dialog where you can select which tables, views, and stored procedures you want to map. In this case I will just select all of the defaults and map all of Northwind (remember I migrated Northwind up to SQL Azure.) Now it is time to build a simple application.


Working with SQL Azure Data

Let’s write a LINQ statement to fetch all of the customers from one country and print them out to the console window. First things first: you have to put in a using statement for OpenAccess:

using Telerik.OpenAccess;

Next we will create our LINQ statement. It will work like any LINQ statement in OpenAccess; there is nothing special for SQL Azure, and this same LINQ statement would work against SQL Server, MySQL, or Oracle.

   1:  static void Main(string[] args)
   2:  {
   3:      //data context
   4:      IObjectScope dat = ObjectScopeProvider1.GetNewObjectScope();
   5:      //LINQ Statement
   6:      var result = from c in dat.Extent<Customer>()
   7:                   where c.Country == "Germany"
   8:                   select c;
   9:      //Print out the company name
  10:      foreach (var cust in result)
  11:      {
  12:          Console.WriteLine("Company Name: " + cust.CompanyName);
  13:      }
  14:      //keep the console window open
  15:      Console.Read();
  16:  }

The code obtains the data context, or IObjectScope, on line 4, filters to the customers in Germany with a simple LINQ statement on lines 6-8, and then iterates over those customers and prints them to the console window in lines 10-13. The result is a list of the German company names in the console window.


It is a pretty basic application; however, you can see that Telerik OpenAccess has full support for SQL Azure. Next week I will show a more complete example.

Enjoy!

posted on Friday, October 30, 2009 3:47:58 AM (Eastern Daylight Time, UTC-04:00)
# Wednesday, October 28, 2009

The developer ecosystem is starting to rally around SQL Azure, and that is a healthy thing. I stumbled across a neat SQL Azure query tool from Microguru today. It is called the Gem Query Tool for SQL Azure and you can download it here. It uses the .NET Framework 3.5 SP1 and is pretty lightweight. The cool thing is that it has a community edition that is free to use.

According to Microguru, the tool:

provides an intuitive user interface to connect to and work with SQL Azure databases. Gem Query Tool supports execution of any DDL and DML script supported by SQL Azure. To facilitate authoring of SQL queries, Gem Query Tool for SQL Azure displays tables and columns in your database.

I gave it a go today and it is simple to use. What stands out about this tool, and why I highlight it after having highlighted a few other tools, is the schema browsing feature. This is a basic feature that SQL Server’s own management tool and the SQL Azure web site both lack. Gem Query allows you to log into one database (my only complaint is that there is no “browse databases” feature) and select your tables or views. Once you click on one, you can view the schema in a nice grid:


Access to this metadata is important after you have migrated some data since some of the data types or defaults may not be what you expect. I also like that when you click on a database object you have a “column” or “data” view. When you click on the “data” tab, you will have the option to select the Top 100, Top 1000 (pretty convenient) and All.


Gem Query has a very easy-to-use query interface. While most other tools leave you on your own to write SQL, Gem Query gives you three windows with the SQL keywords supported by SQL Azure and a list of row-returning database objects (tables, views, etc.). When you select a SQL command, it will put it into the query window for you, as it will for the objects. When you select an object, you will also see the available columns and can click on those to build your query. I wrote this very simple query using the builder:

 

SELECT CustomerID, ContactName, Country
FROM Customers
ORDER BY Country


 

Enjoy!

posted on Wednesday, October 28, 2009 6:53:55 AM (Eastern Daylight Time, UTC-04:00)
# Friday, October 23, 2009

As you most likely know by now, SQL Azure is on production servers and the team is feature complete for the PDC build. What is funny is that they did a migration in the middle of the night on Tuesday, right smack dab in the middle of the day in Bulgaria, where I was demoing SQL Azure at DevReach. That led to some funny demo crashes. What fun!

The good news is that SQL Azure now supports BCP, so you should be able to bulk load data into SQL Azure much more easily. No more crazy workarounds like the ones I posted on this blog and on many others.
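
If you would rather stay in managed code than shell out to the bcp utility, ADO.NET's SqlBulkCopy offers a similar bulk-load path. A sketch, with placeholder server, table, and credential names:

using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    static void LoadCustomers(DataTable customers)
    {
        //placeholder server, database, and credentials
        string connStr = "Server=tcp:yourServer.database.windows.net;" +
                         "Database=Northwind;User ID=user@yourServer;Password=secret;";
        using (SqlBulkCopy bulk = new SqlBulkCopy(connStr))
        {
            bulk.DestinationTableName = "Customers";
            bulk.WriteToServer(customers); //streams all rows to SQL Azure in bulk
        }
    }
}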

Another new feature, which took me a while to discover when all of my unit tests failed, is a firewall. You can now specify which IP addresses can access your SQL Azure account. At first this kind of pissed me off, but I like that I can completely control who can access my account. (Especially since Microsoft will start charging for SQL Azure soon.) To set your allowed IP addresses, just click on the “Firewall” tab in the SQL Azure manager and add a range of IP addresses, and you will be in good shape.


Anyway, SQL Azure support is improved in SQL Server Management Studio; however, you still can’t use the Object Explorer. I reviewed the SQL Azure manager a week or two ago, and it still works as advertised against the new production builds.

image

If you are working with the Visual Studio 2010 Beta 2 that shipped this week, you can use a refreshed version of the SQL Azure Explorer. The 0.2.2 version is built specifically for VS 2010 Beta 2 and is very stable; the author has now marked it as a beta on CodePlex. After you download and install SQL Azure Explorer, you have to go into Visual Studio and enable extensions via Tools|Extension Manager. You will see that SQL Azure Explorer is disabled because extensions are not allowed. Click on the hyperlink, check the option to allow extensions in the dialog that follows, and after you close and reopen Visual Studio you will see SQL Azure Explorer as the first item in the Tools menu.

image

There have been some major speed improvements to the login and object view process, and you now have the ability to create and drop databases, users, and so on. A nice feature (that was also in the last version, though I did not see it then) is that you can parse your query and even format your TSQL. Pretty cool.

image

posted on Friday, October 23, 2009 4:15:02 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Friday, October 09, 2009

Here is another pretty cool find for managing SQL Azure databases. The SQL Azure Explorer for Visual Studio 2010 Beta 1 is a neat little add-in that allows you to explore your SQL Azure databases as well as run SQL statements.

Once you have the tool installed, just fire up Visual Studio 2010 Beta 1 (I assume the tool will be updated when Beta 2 ships) and you can access it from the Tools|SQL Azure Explorer menu option. You have to provide your SQL Azure login credentials (you may have to use the username@servername format for the login).

image

After you have logged in, you will see the Object Explorer tree. You can drill into the database objects or select the top 100 rows of a table or view.

image

You can also right click on the database and say “Open SQL Editor” and type in SQL Statements against your SQL Azure database.

image

If you are using Visual Studio 2010 beta 1 and spend a lot of time in Visual Studio, you may want to give the SQL Azure Explorer a try since you won’t have to leave Visual Studio.

Enjoy!

posted on Friday, October 09, 2009 4:15:54 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Thursday, October 08, 2009

Last month I showed how to manually migrate schema and data from SQL Server 2008 to SQL Azure. I also mentioned the SQL Azure Migration Wizard developed by George Huey in that post. Since then George has updated the tool (currently at 0.2.7) and it now has some great features that make it far more compelling than creating and fixing scripts generated by SQL Server.

To get started, the tool will allow you to select a SQL Server database and then ask you to select which objects to migrate.

image

It will then analyze your database and attempt to fix the problems. For example, if you have a table with a TEXT column, it will automatically make it a varchar(max), and it will also unbind XML schemas and UDTs. It will remove keywords not supported by SQL Azure (like PAD_INDEX, as shown below).

image
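To make the TEXT-to-varchar(max) fix concrete, here is a hypothetical before-and-after, sketched as an ADO.NET call that runs the corrected DDL. The Notes table, server name, and credentials are all made up for illustration; the wizard’s actual output will differ:

using System.Data.SqlClient;

class MigrateNotesTable
{
    static void Main()
    {
        //on SQL Server 2008 the table might have been created like this:
        //  CREATE TABLE Notes (
        //      NoteID int NOT NULL,
        //      Body   text NULL,
        //      CONSTRAINT PK_Notes PRIMARY KEY CLUSTERED (NoteID) WITH (PAD_INDEX = OFF))
        //the wizard-style rewrite swaps text for varchar(max) and drops PAD_INDEX:
        const string ddl =
            @"CREATE TABLE Notes (
                  NoteID int NOT NULL,
                  Body   varchar(max) NULL,
                  CONSTRAINT PK_Notes PRIMARY KEY CLUSTERED (NoteID))";

        //placeholder SQL Azure connection string
        const string azure =
            "Server=tcp:yourserver.ctp.database.windows.net;Database=Northwind_Lite;" +
            "User ID=user@yourserver;Password=yourpassword;";

        using (var conn = new SqlConnection(azure))
        using (var cmd = new SqlCommand(ddl, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

Note that the clustered primary key stays: SQL Azure wants a clustered index on every table.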

After you connect to SQL Azure, the tool makes a script for you. You have the option to run the script as is, save it to disk, or modify it in the window and run it in real time. The scripts have good comments telling you about potential problems, like XML data types. After you review the script, you can execute it, and the tool will report the success or failure of the run.

image

Now that you have migrated the schema, the hard part is over. Next you can migrate the data. The easiest way to do this so far is via Red Gate SQL Compare, as I showed last time, or by scripting the data with the SQL Server script wizard. To get there, right-click on a database in SQL Server 2008’s SQL Server Management Studio and select Tasks|Generate Scripts. After you select your database, you are brought to the Choose Script Options dialog. Turn off all of the options except “Script Data,” and on the next screen choose only the tables you moved using the SQL Azure Migration Wizard.

image

After you choose the tables to move over, you can then choose to send the script to a new window. You will notice that the only thing that was scripted was the INSERT INTO statements.

image

Now select the tab that contains your query and choose Query|Connection|Change Connection from the main menu. Enter the login credentials for your SQL Azure account, then click on the Options tab and enter the name of the database you are connecting to. After you connect, you can attempt to run your query. Just make sure that your script is compatible with SQL Azure before you run it: if you removed any XML data types, etc., using the migration tool, you will have to do the same to your script. There is always some cleanup to do, but it is pretty straightforward. For really large databases, you may want to highlight sections of the script and run them a table or so at a time to prevent timeouts against SQL Azure (see the sketch after the screenshot below). You will also have to make sure that the script sequences the INSERTs to coincide with the foreign key constraints. SQL Server is smart enough to put Order before Order Details, but it does not do this for every object.

image
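If you would rather not babysit the script by highlighting sections, a small helper that splits the generated .sql file on GO and runs one batch at a time accomplishes the same thing. A rough sketch; the file path and connection string are placeholders:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

class RunScriptInBatches
{
    static void Main()
    {
        //placeholder SQL Azure connection string
        const string azure =
            "Server=tcp:yourserver.ctp.database.windows.net;Database=Northwind_Lite;" +
            "User ID=user@yourserver;Password=yourpassword;";

        //placeholder path to the script generated by the wizard
        string script = File.ReadAllText(@"C:\scripts\northwind_data.sql");

        //GO is a batch separator understood by the tools, not by the server,
        //so split on it and send each batch separately
        string[] batches = Regex.Split(script, @"^\s*GO\s*$",
            RegexOptions.Multiline | RegexOptions.IgnoreCase);

        using (var conn = new SqlConnection(azure))
        {
            conn.Open();
            foreach (string batch in batches)
            {
                if (batch.Trim().Length == 0) continue; //skip blank fragments
                using (var cmd = new SqlCommand(batch, conn))
                {
                    cmd.CommandTimeout = 120; //a generous per-batch timeout
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}

The INSERT ordering still has to be right in the file itself; the helper just keeps each round trip small.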

When you are done you can view your database using the SQL Azure Manager I talked about yesterday.

image

Enjoy!

posted on Thursday, October 08, 2009 5:52:46 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Wednesday, October 07, 2009

As I showed on this blog a few weeks ago, SQL Server Management Studio support for SQL Azure leaves a lot to be desired: no changing databases, no Object Explorer, and no easy way to connect. Good thing that Maarten Balliauw and Juliën Hanssens built the SQL Azure Manager. It is available here. It is listed as a proof of concept and “highly unstable,” but after a few days of playing around with it, I have encountered almost no problems. The tool is available only as a ClickOnce application that runs online with no real install, so you can try it out risk free.

Maarten and Julien describe the tool as: SQL Azure Manager is a community effort to quickly enable connecting to your SQL Azure database(s) and perform basic tasks…

That is an understatement; the tool is great and saves lots of time. After the brief ClickOnce setup, I was prompted with the login screen, put in my credentials, and was off to the races.

image

The first thing you notice is that out of the box we have two much-needed features: the ability to change databases and a full-fledged Object Explorer. You can expand to see Tables and Views, but not Stored Procedures.

image

I have a stored procedure in this database and can’t see it in the Object Explorer, so I decided to make my first query a simple one:

SELECT * FROM sys.objects
WHERE type = 'P'

A pretty awesome feature is that you can hit F5, just like you would in SSMS, and get the results as you can see here. I was able to write several TSQL queries and browse the results without any problem. When you right-click on a table, it gives you the option to select the top 100 rows, select all rows, or delete the object. Pretty cool.

image

Another cool feature is that you can drag a column header up to the top to automatically sort. Here I sort by City in the Customers table.

image

My only issue was that after a while the tool timed out; I suspect that this is a SQL Azure thing.
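SQL Azure is known for dropping connections that sit idle, so in your own code a simple retry wrapper goes a long way. A minimal sketch; the retry count and back-off delay are arbitrary choices, not anything the platform prescribes:

using System;
using System.Data.SqlClient;
using System.Threading;

static class SqlRetry
{
    //run some work against a fresh connection, retrying a couple of times
    //if SQL Azure has dropped the connection in the meantime
    public static void Execute(string connectionString, Action<SqlConnection> work)
    {
        const int maxAttempts = 3;
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    work(conn);
                    return;
                }
            }
            catch (SqlException)
            {
                if (attempt >= maxAttempts) throw;
                Thread.Sleep(1000 * attempt); //brief back-off before retrying
            }
        }
    }
}

You would call it like SqlRetry.Execute(connStr, conn => { /* run your command */ });.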

The tool makes it very easy to delete objects and databases, as well as create new databases. (Careful with the delete database feature!!!)

While this is only a 0.1.7 release, it is worth playing with. It will save you lots of time when managing your SQL Azure databases. The roadmap to a 1.0 release is to convert to WPF, move the app to CodePlex, and make it more stable.

Thanks to my colleague Aditya Pendharkar at Telerik for finding this gem of a tool.

posted on Wednesday, October 07, 2009 3:44:35 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Friday, September 18, 2009

Over the past few weeks I have shown how to use Telerik OpenAccess with the WCF Wizard. I think that this wizard is a crucial piece of technology, since I hate writing plumbing code myself: it takes me too long and I usually make mistakes.

I wish that SQL Azure would provide RESTful and WCF services wrapped around your database tables and views with just a check box in the Azure management page. But alas, I can dream. Until that day arrives, you will have to code the services yourself, so I decided to do it with OpenAccess and WCF (and of course the Wizard).

First you need to get some data up in SQL Azure; refer to my post from a few weeks ago for how to do that. Next you have to create a data access layer with Telerik OpenAccess, but you have to create your DAL against a local SQL Server 2005/2008 database with the same schema as the one up in SQL Azure, because SQL Azure does not support querying the schema. After you map your tables to classes (I will do just Customers for this demo), go in and change the connection string in your DAL’s app.config to use SQL Azure instead of the database you used for the mapping:

<connection id="Connection1">
  <databasename>Northwind_Lite</databasename>
  <servername>tcp:tpzlfbclx1234.ctp.database.windows.net</servername>
  <integratedSecurity>False</integratedSecurity>
  <backendconfigurationname>mssqlConfiguration</backendconfigurationname>
  <user>Stevef</user>
  <password>gomets!</password>
</connection>

Now you have to create the WCF service via the wizard (if you forgot how to do that, watch this video). This will be done in a separate project to achieve a full separation of concerns.

image

After you create the WCF service via the Wizard, you can go ahead and create a Silverlight client. Your solution set up should consist of a DAL project, a WCF service project, a Silverlight web project, and a Silverlight client project.

image

 

Ok, XAML time. (In my head I am saying that to the tune of Hammer Time, but I digress….)  I will create a grid that will bind to the CompanyName, ContactName, City, and Phone fields of the Customer table. The grid will do most of the magic for us. I will also add a “Refresh” button as well as a “Save” button. The Refresh button will have the same LoadData() method as in all of my previous blog posts. We’ll talk about Save in a minute.

<data:DataGrid x:Name="dataGridCustomers" Grid.ColumnSpan="4" ItemsSource="{Binding}">
    <data:DataGrid.Columns>
        <data:DataGridTextColumn Binding="{Binding Path=CompanyName}"
                                 Header="Company Name"></data:DataGridTextColumn>
        <data:DataGridTextColumn Binding="{Binding Path=ContactName}"
                                 Header="Contact Name"></data:DataGridTextColumn>
        <data:DataGridTextColumn Binding="{Binding Path=City}"
                                 Header="City"></data:DataGridTextColumn>
        <data:DataGridTextColumn Binding="{Binding Path=Phone}"
                                 Header="Phone"></data:DataGridTextColumn>
    </data:DataGrid.Columns>
</data:DataGrid>

 

If we run our application, you can see that the grid works as advertised, fetching data from SQL Azure.

image

I’ll go in and edit the city for the first record to say “Hong Kong.” To facilitate this, we need to handle the BeginningEdit event of the grid. In the event handler shown below, we stuff the Customer object into our own private collection so we know that the entity is dirty. (If we don’t do this, we won’t know which items are dirty and would have to update all of them, a big waste of resources.) I do this on line 9 (after a check to see if it is already in my collection).

   1:  void dataGridCustomers_BeginningEdit(object sender,
   2:                    DataGridBeginningEditEventArgs e)
   3:  {
   4:      //build a list of Customer that are dirty
   5:      Customer customer = e.Row.DataContext as NorthwindWCFService.Customer;
   6:   
   7:      if (!editedCustomers.Contains(customer))
   8:      {
   9:          editedCustomers.Add(customer);
  10:      }
  11:  }
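
For completeness, editedCustomers (and the updatedCount counter used in the completed handler further down) are just private fields on the page. Assuming the proxy namespace from the snippets, a minimal declaration might look like:

//dirty Customers stashed by the BeginningEdit handler
private List<NorthwindWCFService.Customer> editedCustomers =
    new List<NorthwindWCFService.Customer>();
//async completions counted by UpdateCustomerCompleted
private int updatedCount = 0;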

Now we have to handle the button click (when the user hits Save). The user can hit Save after editing the entire page (or after each row if they prefer, but find me a user who wants to do that). The code below calls the WCF service we created with the wizard asynchronously. In this case we call the service for each customer in our collection (the loop runs from lines 8-11; the call to the WCF service is on line 10).

 
   1:   
   2:  void ButtonSave_Click(object sender, RoutedEventArgs e)
   3:  {
   4:      SampleWCFServiceClient client = new SampleWCFServiceClient();
   5:   
   6:      client.UpdateCustomerCompleted += UpdateCustomerCompleted;
   7:      //save only the dirty customers
   8:      foreach (NorthwindWCFService.Customer customer in editedCustomers)
   9:      {
  10:          client.UpdateCustomerAsync(customer.CustomerID.ToString(), customer);
  11:      }
  12:   
  13:  }

 

In the code directly above, we registered an event handler, UpdateCustomerCompleted, to fire when each Customer is done updating. In theory we don’t need to do anything else as far as the data is concerned; however, we have to clean up our collection of dirty Customers (line 8) as well as set an internal counter back to 0 (line 7). This gives us a clean slate when we start editing again. We also use the opportunity to show a message box (line 10) telling the user that the data was updated. (Yea, yea, I know I need error handling, etc. This is a demo!! :) ) We do this cleanup only after all Customers have been updated and their async calls have been caught; we track this with our counter (lines 3 and 5), comparing it to the number of dirty records. Old school, but effective.

   1:  void UpdateCustomerCompleted(object sender, UpdateCustomerCompletedEventArgs e)
   2:  {
   3:      this.updatedCount++;
   4:   
   5:      if (updatedCount == editedCustomers.Count)
   6:      {
   7:          updatedCount = 0;
   8:          editedCustomers.Clear();
   9:   
  10:          MessageBox.Show("All Customers have been updated successfully!",
  11:                                                "Updating Data", MessageBoxButton.OK);
  12:      }
  13:  }

 

You can see the results here.

image

So let’s do one last thing: add a client-side filter. This will be a dirt simple one, using LINQ to Objects to filter for only customers in Germany. (Of course the filter should really be dynamic, and you may want to move this functionality to the server via your WCF service.)

In the XAML page, I provide a checkbox that says “Show only Germany.” When it is checked, we filter our results with a LINQ statement; when it is unchecked, we show all the records.

image

Our LoadData() method calls the WCF service and registers an event handler, ReadCustomersCompleted. LoadData() is called from the page load, refresh button click, and check box click events.
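LoadData() itself is carried over from the earlier posts; here is a minimal sketch, reusing the generated proxy from the save code (the ReadCustomersAsync method name is my assumption about what the wizard generates):

void LoadData()
{
    //same wizard-generated proxy we used when saving
    SampleWCFServiceClient client = new SampleWCFServiceClient();
    //the completed handler below pushes the results into the grid
    client.ReadCustomersCompleted += ReadCustomersCompleted;
    //kick off the async read (method name assumed)
    client.ReadCustomersAsync();
}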

In past demos, the completed handler just took the result and assigned it to the ItemsSource property of the grid. In this case we check to see if our filter check box is checked (line 6) and, if so, we perform a LINQ query (lines 8-10) and assign the result to the grid (line 12). If the check box is not checked, we perform no filter (line 16).

 

   1:  void ReadCustomersCompleted(object sender, 
   2:                   ReadCustomersCompletedEventArgs e)
   3:  {
   4:      //if the filter is set use a LINQ statement
   5:      //this can also be done on the server via the service
   6:      if (CheckFilter.IsChecked == true)
   7:      {
   8:          var filter = from c in e.Result
   9:                       where c.Country == "Germany"
  10:                       select c;
  11:   
  12:          dataGridCustomers.ItemsSource = filter;
  13:      }
  14:      else
  15:      {
  16:          dataGridCustomers.ItemsSource = e.Result;
  17:      }
  18:  }

 

That is it! Of course you can create nice add and delete dialogs as well; it is just as easy.

Enjoy!

posted on Friday, September 18, 2009 6:37:59 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Friday, September 11, 2009

When SQL Data Services (now SQL Azure) released its first CTP, it did not look anything like SQL Server: there were no tables, stored procedures, views, etc. With the new CTP, SQL Azure embraces SQL Server in the sky and supports the relational model, including stored procedures. This is good news, since there are millions of lines of stored procedures out there in production today, and migrating them to SQL Azure is pretty easy.

I decided to give a simple stored procedure a test drive. I opened up SQL Server Management Studio, started a new query, and entered my SQL:

CREATE PROCEDURE sel_CustomerOrder
@CustomerID char(5)
AS
SELECT o.OrderID, o.OrderDate, p.ProductName,
    (od.UnitPrice*od.Quantity)-((od.UnitPrice*od.Quantity)*od.Discount) as TotalCost
FROM orders o
    inner join [order details] od
        on o.OrderID=od.OrderID
    inner join products p
        on od.ProductID=p.ProductId
WHERE CustomerID=@CustomerID
ORDER BY o.OrderID

 

I hit F5 and all was right with the world. So I decided to try it out:

Exec sel_CustomerOrder @CustomerID='ALFKI'

And got the expected results.

image

You can now call this stored procedure from any application that connects to SQL Azure, since SQL Azure supports standard ADO.NET connection strings. Pretty cool.
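For instance, calling it from a plain ADO.NET console app looks exactly like it would against an on-premise server. A quick sketch; the server name and credentials are placeholders for your own CTP values:

using System;
using System.Data;
using System.Data.SqlClient;

class CallSproc
{
    static void Main()
    {
        //placeholder SQL Azure connection string
        const string azure =
            "Server=tcp:yourserver.ctp.database.windows.net;Database=Northwind_Lite;" +
            "User ID=user@yourserver;Password=yourpassword;";

        using (var conn = new SqlConnection(azure))
        using (var cmd = new SqlCommand("sel_CustomerOrder", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerID", "ALFKI");

            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                //echo each order line the procedure returns
                while (reader.Read())
                {
                    Console.WriteLine("{0} {1} {2}",
                        reader["OrderID"], reader["ProductName"], reader["TotalCost"]);
                }
            }
        }
    }
}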

posted on Friday, September 11, 2009 12:48:44 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback
# Thursday, September 10, 2009

I like the idea of a database in the cloud. We have sort of been doing it for years, connecting to a SQL Server database over port 1433. With SQL Azure, we take this one level further and let Azure worry about the hardware, so we don’t have to worry about bandwidth, RAID arrays, scaling out, and the like.

Last week I showed the basics of how to migrate data from a SQL Server 2008 database to SQL Azure. Yesterday I showed using SQL Azure with Telerik OpenAccess and the WCF REST Toolkit. Today I will show how to build a simple REST-based application using ADO.NET Data Services (aka Astoria).

To get started we need a SQL Azure database. See my previous blog post about the CTP and getting data into your SQL Azure database. Once you have a SQL Azure database all set up let’s get to work.

The next thing we need to do is create a new Web project and our Entity Framework data model. I’ll go ahead and create an Entity Data Model against my local SQL Server 2008 Northwind database that has the same schema as my SQL Azure one. This is because the Entity Framework designer chokes on SQL Azure (or at least my version does!). I will map:

  • Customers
  • Orders
  • Order Details

Now that my EDM is all set up, I will go in and change the connection string in my web.config to use SQL Azure. Here is my new connection string:

<add name="NorthwindEntities" 
connectionString="metadata=res://*/Northwind.csdl
|res://*/Northwind.ssdl|
res://*/Northwind.msl;
provider=System.Data.SqlClient;
provider connection string=&quot;Data Source=tcp:tpzlfbclx123.ctp.database.windows.net;
Initial Catalog=Northwind_Lite;
Integrated Security=False;UID=Stevef;PWD=GoMets!;
MultipleActiveResultSets=False&quot;"
providerName="System.Data.EntityClient"/>

You have to manipulate the EF connection string: put the SQL Azure server name of your CTP in the “Data Source,” put the database name in the Initial Catalog, turn off integrated security, and put in the UID/PWD from the CTP. I set MARS to false since SQL Azure does not support MARS.

Now let’s create the Astoria Service. Add a new “ADO.NET Data Service” to your project. I named mine NwindRestService.

image

Astoria couldn’t make it any easier to get the service up and running. All you need to do is set the name of your EDM on line 2 (in our case NorthwindEntities) and set the access permissions on line 8. I just uncommented the generated line and put in an “*” so all of my entities will inherit the AllRead access rule. With that we are good to go!

   1:  //Enter the name of your EDM (NorthwindEntities)
   2:  public class NwindRestService : DataService<NorthwindEntities>
   3:  {
   4:      public static void InitializeService(IDataServiceConfiguration config)
   5:      {
   6:          //Must set up the AccessRule, here I allow read only access
   7:          //to all entities. I can also do this one by one.
   8:          config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
   9:      }
  10:  }

 

For a reality check, let’s run the service in the browser. Being a RESTful service, Astoria will allow you to browse all of the Customers by typing in this URL:

http://localhost:1075/NwindRestService.svc/Customers

We should see this:

image

I also edited the first row in Northwind (ALFKI) to say “SQL Azure” at the end of the customer name, so I know I am working against SQL Azure and did not mess up my connection strings. That is it: you now have a RESTful service that is hooked up to SQL Azure.

The hard part is over. Now let’s build a simple ASP.NET client to consume the RESTful data.

First you have to set a reference to your service. This will give you a proxy to write some LINQ (to Astoria) code against.

image

 

Next we will create a simple ASP.NET GridView control and bind some data to it in the page load event. (Sure, we could do a lot more, but this is just to get our feet wet with SQL Azure.)

   1:  //the address of our service
   2:  Uri url = new Uri("http://localhost:1075/NwindRestService.svc/", UriKind.Absolute);
   3:  //a ref to our proxy 
   4:  ServiceReference1.NorthwindEntities dat = 
   5:          new ServiceReference1.NorthwindEntities(url);
   6:   
   7:  //LINQ statement to get the data; can use where, orderby, etc.
   8:  var customers =
   9:      from c in dat.Customers
  10:      where c.Country == "Germany"
  11:      orderby c.CustomerID
  12:      select c;
  13:   
  14:  //bind to the grid
  15:  GridView1.DataSource = customers;
  16:  GridView1.DataBind();
 

This is pretty basic code. Line 2 is a Uri reference to our service (which is technically in the same project, but it could, and should, be in a different project). Lines 4-5 set up a reference to the proxy we created; this is also our data context, representing the Astoria service. Lines 8-12 are a simple LINQ to Astoria statement to filter for the German customers (looks like LINQ to SQL? That is the point!), and lines 15-16 are where we bind the data to the ASP.NET GridView. Our GridView looks like this; notice the ALFKI record says it is coming from SQL Azure:

image

That is all there is to it. Enjoy.

posted on Thursday, September 10, 2009 5:39:59 AM (Eastern Daylight Time, UTC-04:00)  #    Comments [0] Trackback