# Thursday, 06 November 2008

Next week at TechEd Europe I will be doing two talks on Scrum (with one repeat) and we are trying something new at TechEd this year, so let me know what you think.

The talk on Tuesday, DVM309: Using Scrum to Run Your Projects, is a typical TechEd breakout session in lecture format with Q&A encouraged. I’ll go through slides and examples from my experience as a scrum master (and also share some of my experiences from the certified scrum master class). It is a good overview of Scrum, suitable for beginners as well as experienced scrum masters trying to scale Scrum out.

On Tuesday and Friday we turn the tables in DVP04-IS: The Tech*Ed Daily Scrum! This is an interactive session where I will be passing around a microphone; it will be 100% Q&A and war stories, with no slides if I can help it. (Come on, ask a lot of questions and tell a lot of war stories, and make my week a little easier!) I have done the “Daily Scrum” talk about 10 times this year in several places (New York, TechEd US in Orlando, Egypt, Pakistan, the Netherlands, Bulgaria, Serbia, the Connecticut .NET User Group, etc.), and every time it is different and exciting. I always learn something from the audience as well. Everyone is welcome; you will see how Scrum works in the real world, including real-life implementations. Since it is mostly interactive, it is great for people who want to learn about Scrum as well as for Scrum experts. My only rule is no religious warfare; other than that, anything goes! (Just ask the Serbians: it was also the last session of their conference and we all drank beer as we did the Q&A.)

See you all there… If you can’t make it, I hope they will film it and put it online.

posted on Thursday, 06 November 2008 08:20:19 (Eastern Standard Time, UTC-05:00)  #    Comments [0] Trackback
# Wednesday, 05 November 2008

Yesterday I showed some new language constructs in TSQL that will be very familiar to C# developers. I will be doing three talks on TSQL next week at TechEd Developers in Barcelona, and the talks are designed for application developers. (But database geeks will enjoy these sessions as well!) If you write any TSQL code in your applications today, or interact with it, these sessions will give you some advice and help change the way you approach particular problems.

Today I want to show a new feature: using a table-valued parameter (TVP) in a stored procedure and then calling it from an ASP.NET page, or any other .NET client, using ADO.NET. TVPs are new in SQL Server 2008: you can now declare a variable of a TABLE type, insert data into it, and then pass it to a stored procedure. Have you ever passed a CSV string (or even an XML data type) to a DAL or stored procedure, then parsed it and executed some SQL? TVPs eliminate that. Here is an example:
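For contrast, here is a rough sketch of the kind of pre-2008 workaround a TVP replaces: pass one delimited string and split it apart in TSQL. (The procedure name usp_ins_FromCsv is just for illustration; in a real procedure each parsed item would be inserted rather than printed.)

```sql
--a sketch of the old way: pass a CSV string and parse it in a loop
CREATE PROCEDURE usp_ins_FromCsv (@List varchar(max))
AS
DECLARE @Pos int
WHILE LEN(@List) > 0
BEGIN
     SET @Pos = CHARINDEX(',', @List)
     IF @Pos = 0 SET @Pos = LEN(@List) + 1
     --each parsed item would be INSERTed here
     PRINT LEFT(@List, @Pos - 1)
     SET @List = SUBSTRING(@List, @Pos + 1, LEN(@List))
END
```

All that string surgery (and its edge cases) is exactly what the TVP below makes unnecessary.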

First create a new table for us to play with:

```sql
CREATE TABLE TVP_TEST
(Field_ID int PRIMARY KEY, Field_DESC varchar(10))
```

Next create the TVP by declaring a new type. Notice that I have created the type to be exactly the same shape as the underlying table. This is not a requirement; however, it is important to do so if you are going to be doing inserts as we do below.

```sql
CREATE TYPE TVP_TESTType AS TABLE(
     Field_ID int,
     Field_DESC varchar(10))
```

Next create a variable of type TVP_TESTType and insert some data into it; the insert is done the same way we would add data to a normal table. Notice that this only puts data into the variable, NOT the underlying table:

```sql
--declare a variable of the type
DECLARE @MyTable as TVP_TESTType

--put data into the type
INSERT INTO @MyTable VALUES
     (1, 'Test'), (2, 'Next Test')
```

Now you can insert data into a table very easily (remember, @MyTable has to be in scope):

```sql
INSERT INTO TVP_TEST
     SELECT * FROM @MyTable
```

To make this really useful, let’s create a stored procedure that accepts the TVP_TESTType data type as a parameter and inserts data into the TVP_TEST table using it. Notice there is no parsing of CSV strings, etc.:

```sql
CREATE PROCEDURE usp_ins_MyTable
(@MyTVP as TVP_TESTType READONLY)
AS
INSERT INTO TVP_TEST
     SELECT * FROM @MyTVP
```

You can call this stored procedure like so, as long as @MyTable is in scope and is filled with data:

```sql
EXEC usp_ins_MyTable @MyTVP = @MyTable
```

But what is really exciting is calling this from a .NET client. If you have an ADO.NET DataSet that is the same shape as the TVP, you can pass it from the client to the server, greatly enhancing data access and batch operations. You pass a DataSet’s table in as a parameter using SqlDbType.Structured as its data type.

 

Here is some C# code; just remember that ds is our active DataSet, filled with data and shaped the same way as TVP_TESTType (note that the procedure and parameter names match the ones we created above):

```csharp
//conn is an open SqlConnection
SqlCommand cmd = new SqlCommand("usp_ins_MyTable", conn);
cmd.CommandType = CommandType.StoredProcedure;
//add the DataTable here as a TVP
SqlParameter sp = cmd.Parameters.AddWithValue("@MyTVP", ds.Tables[0]);
//notice Structured
sp.SqlDbType = SqlDbType.Structured;
cmd.ExecuteNonQuery();
```

Pretty easy! Now you can pass DataSets in as parameters to a stored procedure, eliminating some funky parsing on either the client or the server.

posted on Wednesday, 05 November 2008 10:59:27 (Eastern Standard Time, UTC-05:00)  #    Comments [1] Trackback
# Tuesday, 04 November 2008

I am doing a talk next week at TechEd Developers in Barcelona about new TSQL enhancements for developers. The session is:

DAT313 T-SQL Enhancements in SQL Server 2008: Getting Down and Dirty. It is Wednesday at 1:30 in room 115.

The goal of this talk is to show you some cool new stuff in TSQL, but also to show C# and VB developers that TSQL is not all that bad. I get the impression that developers hate TSQL. :)

So the TSQL designers decided to add a few “delighters” to the syntax to make C# and VB developers feel more at home.

The first is variable assignment at declaration time. For example:

```sql
DECLARE @MSFT char(4) = 'MSFT'
PRINT 'Using: ' + @MSFT
```
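For comparison, before SQL Server 2008 the same thing took two statements:

```sql
--the pre-2008 way: declare, then assign separately
DECLARE @MSFT char(4)
SET @MSFT = 'MSFT'
PRINT 'Using: ' + @MSFT
```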

The next is shortcut compound-assignment syntax like += or -=:

```sql
UPDATE Portfolio
     SET Qty += 10
     WHERE Symbol = 'MSFT'
```
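That statement is simply shorthand for the traditional longhand form:

```sql
--the equivalent pre-2008 longhand
UPDATE Portfolio
     SET Qty = Qty + 10
     WHERE Symbol = 'MSFT'
```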

The addition of table-valued parameters (TVPs), which let you pass DataSets via ADO.NET in C#/VB as parameters to a stored procedure, will really make C# developers’ lives easier (demo soon on the blog). Also, cool integration with Google Maps or Virtual Earth and the geospatial data types will make developing apps with location much easier.

There are lots of other cool new features in there for developers; I will post more on the blog soon.

posted on Tuesday, 04 November 2008 10:20:51 (Eastern Standard Time, UTC-05:00)  #    Comments [0] Trackback
# Monday, 03 November 2008

I have been a fan of the cloud since Amazon released its first APIs. We have been waiting for Microsoft to enter the cloud space, and we have seen some stuff drip out over the last year: Astoria (while not the cloud itself, it is a RESTful service that makes us cloud ready), Live Mesh (which people confuse with a consumer offering, but which is actually a development platform), and SQL Server Data Services (SSDS).

Last week at PDC, Microsoft spoke about Windows Azure, its cloud services platform. It will consist of web application and service hosting, .NET Services (basically the BizTalk services stack), storage, and data services (SSDS, now just SDS). Some developers at the PDC were saying “this is like the ASP model ten years ago; Azure is doomed to fail.” So the question is: will Azure succeed where the ASP model failed?

The ASP model was about a generic application hosted by an ISP and sold to you as a service. Picture this: instead of running ADP’s accounting software yourself, you would log onto the ADP web site and use it as a service. This model did not completely fail, but it did not deliver on its mission. It was also a lot of .com hype, and it was about 10-15 years ahead of its time in both the technology and the business acceptance.

While things like Live Services and hosted Exchange are part of Azure, Azure is not about the ASP model; it is about hosting your apps, services, and data in the cloud. There is a need for this: Amazon EC2 and S3 are quite successful, even with the crowd you would think would never put its data in the cloud: banks. It will take time, but companies will buy into this paradigm and make the shift. The first thing to go into the cloud en masse will be hosted Exchange, then file server capabilities, then applications, then data. Small businesses will go first. It may take years for the shift to be complete, but it will happen. It just makes sense to have your applications hosted in the cloud; why bother worrying about the infrastructure? Infrastructure will be a commodity by 2012. By then most new apps will be hosted in the cloud or will use the cloud infrastructure for .NET Services or data.

Only 12 years too late! During the .com era, when I was the CTO of a large .com, I spent 65% of my time worrying about infrastructure (bandwidth, RAID arrays, load balancing, switches, etc.). Years later at Corzen, supporting our spidering engines, I still spent about 50% of my time on infrastructure (the only reason it was lower than in the .com era was virtualization). Now if I need more bandwidth or more load balancing, it is just the click of a button. Sure, it is not going to be quite that easy, but even if Azure delivers on 50% of its vision, it will reduce my focus on infrastructure by 95%.

.NET Services (formerly BizTalk Services) in the cloud will get adopted by developers as it matures and as apps get moved to the cloud. SQL Services will get adopted in version 2, when you can use more relational features such as tables, joins, and views instead of the “flexible entities” approach of the first release.

Bottom line is that Azure will succeed, but it will take time for the world to catch up to Microsoft’s vision. Amazon (and to some degree Google) have paved the way.

posted on Monday, 03 November 2008 09:56:29 (Eastern Standard Time, UTC-05:00)  #    Comments [2] Trackback
# Sunday, 02 November 2008

OK, OK, it is not a smackdown, but more of a review of SP1 technology: EF, Dynamic Data, Astoria, and LINQ to REST. I will be doing a session at TechEd next week in Barcelona; however, you can see a shortened version of the talk from my DevReach keynote.

posted on Sunday, 02 November 2008 10:30:52 (Eastern Standard Time, UTC-05:00)  #    Comments [0] Trackback
# Saturday, 01 November 2008

Why not? I got home late last night from PDC and decided to stay up late, watch some TV and install my pre-beta version of Windows 7 on my spare laptop. The installation went very well and I decided to take some of the new features out for a spin.

While there are some bells and whistles for the consumer, as a developer I really like the virtual machine support. I was able to load up my VHDs really fast and even set a VHD as a boot partition (wow!). The VHD support is so good that I am considering bringing this Windows 7 laptop to TechEd in Barcelona and doing all of my sessions off it.

It has only been one day, but Windows 7 appears very stable, has good power management, and is fast. It is also pretty slick looking with the Avalon UI stuff turned on. If my experience changes, I will report back here.

posted on Saturday, 01 November 2008 09:34:39 (Eastern Standard Time, UTC-05:00)  #    Comments [3] Trackback
# Friday, 31 October 2008

Here are some conclusions I have drawn from four days at the PDC. Microsoft did not say any of this; I have drawn these conclusions from their messages and behavior at the PDC. By no means is this an exhaustive list; it is just based on my personal experience this week:

  • The Influence of Dynamic Languages. From the DLR to the new dynamic type in C#, you can see the effect of Ruby (and Python) on Microsoft. Anders is not a language purist; he said that C# will draw on the benefits of functional, dynamic, and statically typed languages. He also said that languages with a single focus are doomed to fail and that more flexible languages will succeed. To me this was the biggest news of the PDC: it shows Microsoft’s reaction to “Web 2.0” and Ruby.
  • The Death of JavaScript. Microsoft never said this; however, two sessions showed me that it may well come true. The first main session of the PDC, Anders’s “Future of C#,” showed how you can use the dynamic type to move your JavaScript into C# in the code-behind and run it on the client via Silverlight. The last session of the PDC, “ASP.NET AJAX Futures,” was all about implementing client-side AJAX; with declarative programming you can pretty much eliminate most JavaScript. The AJAX talk was all about client-side binding; the demos were even done in .htm pages, not .aspx pages!
  • The Future of Web Development is Silverlight. The ASP.NET 4.0 futures talks were all about Dynamic Data and MVC, stuff we already know. Everything else was about Silverlight. See the addition of the dynamic type and the DLR: it is all about writing code behind Silverlight. There were also talks about line-of-business apps built in Silverlight. The writing is on the wall: learn Silverlight.
  • Windows is Sexy Again. From the keynote to the “I’m a PC” tee shirts to the gazillion sessions about Win 7, Windows is sexy again to develop on. Forget about Vista, that never happened. Our bad.
  • Cloud, Cloud, Cloud. Developers were underwhelmed by the cloud. Yeah, we get it: hosting our services and apps in the cloud is great, but did we really need 36 sessions on it? We have been using the cloud (Amazon EC2, S3) for a while now, so we get it. Microsoft proves once again that when it is behind in the market, it knows how to overdo it. The offering is very mature and robust; the message just seemed like overkill.
  • Where was SQL Server? I thought the PDC was about the future, yet there was only one slide about the future of SQL Server. They said it was too “early”; however, we had a SQL Services futures talk (v2) when v1 has not even shipped yet! I am not implying that SQL Server is getting less investment, but there should have been a “here is what we are thinking about for v.next” talk.
  • Oslo is Way Cool, but What Do I Use It For? There were five sessions on this new pre-alpha thing called Oslo. It looked cool, especially the ability to author a contextual DSL. There were cool tee shirts and a large booth with cool swag. But the buzz among developers was “what do I use this for?” Microsoft did a great job telling us about the cool things in Oslo, but it did not present the use case. I expect next year’s PDC to do that and to have a ton of Oslo content.
posted on Friday, 31 October 2008 09:59:58 (Eastern Standard Time, UTC-05:00)  #    Comments [1] Trackback
# Thursday, 30 October 2008

One of my favorite technologies to come out of Microsoft this year is ADO.NET Data Services, aka Project Astoria. Astoria is a new technology that puts a REST-based data service on top of your data. Very easily, you can expose your application’s data as a service, allowing other applications, including non-Microsoft ones, to consume it via HTTP and REST. Astoria has a rich extensibility model where you can hook directly into the service and take direct control over a request. This enables you to build very secure, flexible, and robust services.
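To make “REST-based data service” concrete, here is roughly what Astoria addressing looks like; the Northwind service name and entity sets are just illustrative, but the URI conventions and query options ($top, $orderby, $filter) are part of ADO.NET Data Services:

```
GET /Northwind.svc/Customers                   --all customers, returned as an ATOM feed
GET /Northwind.svc/Customers('ALFKI')          --a single customer, addressed by key
GET /Northwind.svc/Customers('ALFKI')/Orders   --navigate to that customer's orders
GET /Northwind.svc/Orders?$top=10&$orderby=OrderDate desc
```

Because every entity and relationship gets a URI, any HTTP client on any platform can query the data without a Microsoft stack on the other end.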

The one limitation of Astoria is that the client has to be online to consume the service. If your client is offline or occasionally connected, working with Astoria is not really possible. That will change very soon. Yesterday at PDC I went to session TL08, Offline-Enabled Data Services and Desktop Applications, delivered by Pablo Castro. For those of you who don’t know Pablo, he is a great guy and a straight shooter. He believes in “transparent design,” where the team is 100% open and debates design and features on its blog for the whole world to watch and give feedback on.

I have been writing “disconnected” applications that rely on synchronization for almost 15 years. I have been involved in some great applications relying on replication, like the NFL Scouting application I showed at TechEd 2003, which uses Pocket PCs (.NET Compact Framework on Windows Mobile/CE) and laptops with SQL Express to connect to a back-end SQL Server. The challenge of the disconnected user has been around ever since we have been able to do any form of data replication. I know first hand. :)

Astoria Offline’s offering is very elegant. You can create an end-to-end solution very easily; the “piece of cake” scenario, where you click through all the wizards, gives you a good solution out of the box, and you only have to write a little bit of code to extend the application to fit your needs. There is also a robust API if you want to do everything yourself or if you are not using the Microsoft tools on the client.

What I really like about Astoria is its conflict resolution; this has been a big challenge for other replication systems. Astoria will sync all of the good records in the batch and do nothing to the conflicted records. You then get an exception and an event for the conflicted records, so you can write code to handle them yourself. Astoria will not change any data unless you tell it to, giving you full control. While this seems small, it is a very elegant design pattern.

There are also some great things in there to let you batch your sync and do partial replication (for example, allowing only NY salespeople to see NY data) with something called Scopes. There are nice hooks for authentication and authorization as well.

Look for a CTP by the end of the year. As Pablo said, it is not a CTP good enough to build applications on, but it is good enough to give the team feedback. More than most other teams at Microsoft, the Astoria team really listens. So grab the CTP, write some code, and let them know what you think.

posted on Thursday, 30 October 2008 11:50:27 (Eastern Standard Time, UTC-05:00)  #    Comments [0] Trackback