# Thursday, November 13, 2008

After 4 days at TechEd here in Barcelona, I have done 6 sessions and have 2 to go tomorrow. They are:

DAT02-IS: Data Access Smackdown
Interactive Session, Database Platform
11/14/2008, 10:45 AM - 12:00 PM

DVP04-IS (R): Tech·Ed Daily Scrum!
Interactive Session, Development Practices
11/14/2008, 3:15 PM - 4:30 PM

posted on Thursday, November 13, 2008 10:06:13 AM (Eastern Standard Time, UTC-05:00)
# Monday, November 10, 2008

A lot of RDs arrived in Barcelona yesterday to start TechEd Europe 2008. We went out for some tapas and sangria last night, and the bill came to 779 euros. What happens when you get 15 RDs trying to split the bill with their credit cards? Chaos.

 

[Image: the bill]

posted on Monday, November 10, 2008 10:05:26 AM (Eastern Standard Time, UTC-05:00)
# Saturday, November 8, 2008

Here is a cool TSQL tip I will be showing off next week at TechEd Europe at one of my sessions (Rock Star Common Table Expressions).

Let’s say you have a product table that looks like this:

[Image: the Products table, which contains rows with duplicate Product_Name values]

As you can see in the image, we have some duplicate data. You can use a Common Table Expression in SQL Server 2005 or 2008 to look up the "parents" using an aggregate HAVING clause. Then you join the CTE to the Products table on the Product_Name field as well as the Product_ID field, using the > operator so that we only return the "children", or products with a higher product ID. Here is the CTE and code:

--Find the dupes with a CTE
WITH CTEMinProductRecords
AS
(
 --one row per duplicated name, keeping the lowest (parent) Product_ID
 SELECT MIN(Product_ID) AS Product_ID, Product_Name
 FROM Products
 GROUP BY Product_Name
 HAVING COUNT(*) > 1
)
SELECT cte.Product_ID AS ParentID, cte.Product_Name AS ParentProduct,
 p.Product_ID AS DupeProductID, p.Product_Name AS DupeProduct,
 p.Price AS DupePrice
FROM Products p
JOIN CTEMinProductRecords cte ON
 p.Product_Name = cte.Product_Name
 AND p.Product_ID > cte.Product_ID

Here are the results:

[Image: the query results, pairing each parent row with its higher-ID duplicates]

Let's say you want to automatically delete the children. While the business case for this may exist (my old business did have a rule that the higher product ID was the dupe), you will want to update all of the "many" tables, that is, any child tables that reference Product_ID, to point to the lower product ID first. Then, to do the delete using the CTE, all you have to do is convert the above SELECT statement to a DELETE statement:

WITH CTEMinProductRecords
AS
(
 SELECT MIN(Product_ID) AS Product_ID, Product_Name
 FROM Products
 GROUP BY Product_Name
 HAVING COUNT(*) > 1
)
--delete the aliased rows, i.e. the higher-ID children
DELETE p
FROM Products p
JOIN CTEMinProductRecords cte ON
 p.Product_Name = cte.Product_Name
 AND p.Product_ID > cte.Product_ID

Enjoy!
posted on Saturday, November 8, 2008 4:30:25 PM (Eastern Standard Time, UTC-05:00)
# Friday, November 7, 2008

I am a data guy. Maybe that is why the NHibernate Mafia likes to call me a database weenie. Next week at TechEd Europe I am doing 5 sessions about data. Friday morning I am doing DAT02-IS: Data Access Smackdown. As I said before, it is not really a smackdown, but a look at the SP1 technology, followed by a discussion on how best to make choices about which technology to use in your projects. One of the things that I love is data as a service. That is one reason why I am so solidly behind the Astoria project at Microsoft (aka ADO.NET Data Services).

One thing that I show in the Smackdown session is a simple LINQ to REST demo. Using LINQ, you can go against a raw REST-based data service. (You can also do this via a proxy, which I will also show at the session and in a later blog post.)

I have a simple Astoria service here. Let’s take a look at how to talk to it via LINQ to REST. First you have to set a reference to System.Data.Services.Client and then pull in the namespace like so:

using System.Data.Services.Client;

Next you have to create a class to hold your data. Since we are modeling the Customer entity in my Astoria service, you have to give the type exactly the same property names and data types:

public class Customer
{
    public string CustomerID { get; set; }
    public string CompanyName { get; set; }
    public string ContactName { get; set; }
    public string ContactTitle { get; set; }
    public string Address { get; set; }
    public string City { get; set; }
    public string Region { get; set; }
    public string PostalCode { get; set; }
    public string Country { get; set; }
    public string Phone { get; set; }
    public string Fax { get; set; }
}

Next, inside of a console application, we create a new Uri pointing to the Astoria service on my server. Then we query the service using the Astoria URI syntax (?$filter=) appended to the Uri. After that it is really simple: just loop through the customers and do whatever you want with them. Pretty easy! In a future blog post I will show you how to use the traditional LINQ operators (from, where, select) against an Astoria service.

static void Main(string[] args)
{
    Console.Title = "Linq to Rest!!";

    Uri url = new Uri("http://stevef.goes.com/northwindservice/NorthwindService.svc/");

    DataServiceContext ctx = new DataServiceContext(url);

    IEnumerable<Customer> customers = ctx.Execute<Customer>(
        new Uri("Customer?$filter=Country%20eq%20'Germany'", UriKind.Relative));

    //write it out to the console window
    foreach (var c in customers)
    {
        Console.WriteLine(c.CompanyName);
    }
    //keep the window open
    Console.Read();
}
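As a quick preview of that future post, here is a minimal sketch of what the LINQ query syntax looks like, dropped into the same console app. A couple of assumptions: it reuses the Customer class above, and it assumes the entity set is named Customer, inferred from the relative URI in the Execute<Customer> call. DataServiceContext.CreateQuery<T> returns an IQueryable, so the standard query operators compose over it and are translated to the same $filter syntax on the wire:

//a minimal preview sketch: LINQ query syntax over the same Astoria service
//requires using System.Linq; in addition to System.Data.Services.Client
//"Customer" as the entity set name is an assumption inferred from the
//relative URI used with Execute<Customer> above
DataServiceContext ctx = new DataServiceContext(
    new Uri("http://stevef.goes.com/northwindservice/NorthwindService.svc/"));

var germanCustomers = from c in ctx.CreateQuery<Customer>("Customer")
                      where c.Country == "Germany" //becomes $filter=Country eq 'Germany'
                      select c;

foreach (var c in germanCustomers)
{
    Console.WriteLine(c.CompanyName);
}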

posted on Friday, November 7, 2008 11:44:44 AM (Eastern Standard Time, UTC-05:00)
# Thursday, November 6, 2008

Next week at TechEd Europe I will be doing two talks on Scrum (with one repeat) and we are trying something new at TechEd this year, so let me know what you think.

The talk on Tuesday, DVM309: Using Scrum to Run Your Projects, is a typical TechEd breakout session in lecture format with Q&A encouraged. I'll go through slides and examples from my experience as a scrum master (and also share some of my experiences from the certified scrum master class). It is a good overview of Scrum, suitable for beginners as well as experienced scrum masters trying to scale Scrum out.

On Tuesday and Friday we turn the tables in DVP04-IS: The Tech·Ed Daily Scrum! This is an interactive session where I will be passing around a microphone; it will be 100% Q&A, war stories, and interaction, with no slides if I can help it. (Come on, ask a lot of questions and tell a lot of war stories, and make my week a little easier!) I have done the "Daily Scrum" talk about 10 times this year in several places (New York, TechEd US in Orlando, Egypt, Pakistan, the Netherlands, Bulgaria, Serbia, the Connecticut .NET User Group, etc.), and every time it is different and exciting. I always learn something from the audience as well. Everyone is welcome; you will see how Scrum works in the real world, with real-life implementations. Since it is mostly interactive, it is great for people who want to learn about Scrum as well as experts in Scrum. My only rule is no religious warfare; other than that, anything goes! (Just ask the Serbians: it was the last session of their conference and we all drank beer as we did the Q&A.)

See you all there… If you can’t make it, I hope they will film it and put it online.

posted on Thursday, November 6, 2008 8:20:19 AM (Eastern Standard Time, UTC-05:00)
# Wednesday, November 5, 2008

Yesterday I showed how there are some new language constructs in TSQL that will be very familiar to C# developers. I will be doing three talks on TSQL next week at TechEd Developers in Barcelona, and the talks are designed for application developers. (But database geeks will enjoy these sessions as well!) If you write any TSQL code in your applications today, or interact with it, these sessions will give you some advice and help change the way you approach particular problems.

Today I want to show a new feature, Table-Valued Parameters (TVPs): using a TVP in a stored procedure and then calling it from an ASP.NET page or any other .NET client using ADO.NET. TVPs are new in SQL Server 2008: you can now declare a type of TABLE, insert data into it, and then pass it to a stored procedure. Have you ever passed a CSV string, or even an XML data type, to a DAL or stored procedure, then parsed it and executed some SQL? TVPs eliminate that. Here is an example:

First create a new table for us to play with:

Create Table TVP_TEST
(Field_ID int primary key, Field_DESC varchar(10))

Next create the TVP by declaring a new type. Notice that I have created the type to be exactly the same shape as the underlying table. This is not a requirement; however, it is important if you are going to be doing SELECT * inserts as we do below.

 

CREATE TYPE TVP_TESTType AS TABLE(
     Field_ID int,
     Field_DESC varchar(10))

 

Next declare a variable of type TVP_TESTType and insert some data into it; the insert is done the same way we would add data to a normal table. Notice that this only puts data into the variable, NOT the underlying table:

--declare a variable of the type
DECLARE @MyTable as TVP_TESTType

--put data into the type
INSERT INTO @MyTable VALUES
     (1, 'Test'), (2, 'Next Test')

 

Now you can insert data into a table very easily (remember @MyTable has to be in scope):

INSERT INTO TVP_TEST
     SELECT * FROM @MyTable

 

To make this really useful, let's create a stored procedure that accepts the TVP_TESTType data type as a parameter and inserts data into the TVP_TEST table using that parameter. Notice there is no parsing of CSV strings, etc. Also notice that a TVP parameter must be declared READONLY:

CREATE PROCEDURE usp_ins_MyTable
(@MyTVP as TVP_TESTType READONLY)
As
INSERT INTO TVP_TEST
     SELECT * FROM @MyTVP

You can call this stored procedure like so, as long as @MyTable is in scope and filled with data:

 

exec usp_ins_MyTable @MyTVP=@MyTable

But what is really exciting is calling this from a .NET client. If you have an ADO.NET DataTable that is the same shape as the TVP, you can pass it from the client to the server, greatly enhancing data access and batch operations. You pass the DataTable in as a parameter using SqlDbType.Structured as its data type.

 

Here is some code from C#; just remember that ds is our active DataSet, filled with data and with its first table shaped the same way as TVP_TESTType:

 

SqlCommand cmd = new SqlCommand("usp_ins_MyTable", conn);
cmd.CommandType = CommandType.StoredProcedure;
//add the DataTable here as a TVP
SqlParameter sp = cmd.Parameters.AddWithValue("@MyTVP", ds.Tables[0]);
//notice structured
sp.SqlDbType = SqlDbType.Structured;
cmd.ExecuteNonQuery();

Pretty easy! Now you can pass a DataTable in as a parameter to a stored procedure, eliminating some funky parsing on either the client or the server.
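For completeness, here is a minimal end-to-end sketch of the client side. It is just a sketch under stated assumptions: it assumes the TVP_TEST table, TVP_TESTType type, and usp_ins_MyTable procedure created above; it builds the DataTable by hand instead of using a pre-filled DataSet; and the connection string is a placeholder you would point at your own server:

using System.Data;
using System.Data.SqlClient;

class TvpDemo
{
    static void Main()
    {
        //build a DataTable shaped exactly like TVP_TESTType
        DataTable table = new DataTable();
        table.Columns.Add("Field_ID", typeof(int));
        table.Columns.Add("Field_DESC", typeof(string));
        table.Rows.Add(3, "Third");
        table.Rows.Add(4, "Fourth");

        //placeholder connection string; point it at your own server
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDatabase;Integrated Security=SSPI"))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand("usp_ins_MyTable", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            //pass the whole DataTable as the TVP parameter
            SqlParameter sp = cmd.Parameters.AddWithValue("@MyTVP", table);
            sp.SqlDbType = SqlDbType.Structured;
            cmd.ExecuteNonQuery();
        }
    }
}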

posted on Wednesday, November 5, 2008 10:59:27 AM (Eastern Standard Time, UTC-05:00)
# Tuesday, November 4, 2008

I am doing a talk next week at TechEd Developers in Barcelona about new TSQL enhancements for developers. The session is:

DAT313 T-SQL Enhancements in SQL Server 2008: Getting Down and Dirty. It is Wednesday at 1:30 PM in room 115.

The goal of this talk is to show you some cool new stuff in TSQL, but also to show C# and VB developers that TSQL is not all that bad. I get the impression that developers hate TSQL. :)

So the TSQL designers decided to add a few “delighters” to the syntax to make C# and VB developers feel more at home.

The first is variable assignment at declaration time. For example:

DECLARE @MSFT char(4) = 'MSFT'
PRINT 'Using: ' + @MSFT

The next is using shortcut assignment syntax like += or -=:

UPDATE Portfolio
     SET Qty += 10
     WHERE Symbol = 'MSFT'

The addition of Table-Valued Parameters (TVPs), which allow you to pass datasets via ADO.NET in C#/VB as parameters to a stored procedure, will really make C# developers' lives easier (demo soon on the blog). Also, cool integration between Google Maps or Virtual Earth and the geospatial data types will make developing apps with location much easier.
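As a small taste of the geospatial support from the C# side, here is a minimal sketch using the SqlGeography type from the Microsoft.SqlServer.Types assembly that ships with SQL Server 2008. The coordinates and the distance calculation are just illustrative assumptions, not from the session demos:

using System;
using Microsoft.SqlServer.Types; //ships with SQL Server 2008

class GeoDemo
{
    static void Main()
    {
        //two illustrative points: New York and Barcelona (latitude, longitude, SRID 4326)
        SqlGeography newYork = SqlGeography.Point(40.7128, -74.0060, 4326);
        SqlGeography barcelona = SqlGeography.Point(41.3851, 2.1734, 4326);

        //STDistance returns meters for the 4326 (WGS 84) spatial reference
        double km = newYork.STDistance(barcelona).Value / 1000;
        Console.WriteLine("New York to Barcelona: {0:N0} km", km);
    }
}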

There are lots of other cool new features in there for developers; I will post some more on the blog soon.

posted on Tuesday, November 4, 2008 10:20:51 AM (Eastern Standard Time, UTC-05:00)
# Monday, November 3, 2008

I have been a fan of the cloud since Amazon first released its APIs. We have been waiting for Microsoft to enter the cloud space, and we have seen some things drip out over the last year: Astoria (while not cloud itself, it is a RESTful service that allows us to be cloud ready), Live Mesh (which people confuse with a consumer offering, but which is actually a development platform), and SQL Server Data Services (SSDS).

Last week at PDC, Microsoft spoke about Windows Azure, its cloud services platform. It will consist of web application and service hosting, .NET Services (basically the BizTalk services stack), storage, and data services (SSDS, or now just SDS). Some developers at the PDC were saying "this is like the ASP model ten years ago; Azure is doomed to fail." So the question is, will Azure succeed where the ASP model failed?

The ASP model was about a generic application hosted by an ISP and sold to you as a service. Picture this: instead of running ADP's software for your accounting, you would log onto the ADP web site and use it as a service. This model did not completely fail, but it did not deliver on its mission. It was caught up in a lot of .com hype, and it was about 10-15 years ahead of its time in both the technology and the business acceptance.

While things like Live Services and hosted Exchange are part of Azure, Azure is not about the ASP model; it is about hosting your apps, services, and data in the cloud. There is a need for this: Amazon EC2 and S3 are quite successful, even with the crowd that you would think would never put their data in the cloud: banks. It will take time, but companies will buy into this paradigm and make the shift. The first thing to go into the cloud en masse will be hosted Exchange, then file server capabilities, then applications, then data. Small businesses will go first. It may take years for the shift to be complete, but it will happen. It just makes sense to have your applications hosted in the cloud; why bother worrying about the infrastructure? Infrastructure will be a commodity by 2012. By then most new apps will be hosted in the cloud or using the cloud infrastructure for .NET Services or data.

Only 12 years too late! During the .com era, when I was the CTO of a large .com, I spent 65% of my time worrying about infrastructure (bandwidth, RAID arrays, load balancing, switches, etc.). Years later at Corzen, I still focused on infrastructure about 50% of the time to support our spidering engines (the only reason it was lower than in the .com era was virtualization). Now if I need more bandwidth or more load balancing, it is just a click of a button. Sure, it is not going to be quite that easy, but even if Azure delivers on 50% of its vision, it will reduce my focus on infrastructure by 95%.

.NET Services (formerly BizTalk Services) in the cloud will get adopted by developers as it matures and as apps get moved to the cloud. SQL Services will get adopted in version 2, when you can use more relational features such as tables, joins, and views, instead of the "flexible entities" approach of the first release.

Bottom line is that Azure will succeed, but it will take time for the world to catch up to Microsoft’s vision. Amazon (and to some degree Google) have paved the way.

posted on Monday, November 3, 2008 9:56:29 AM (Eastern Standard Time, UTC-05:00)