Tuesday, December 09, 2008

PDC2008 - Day 4

My fourth day at PDC was dominated by two very interesting sessions regarding RESTful web service design and creating textual domain-specific languages with the new "M Grammar", which is part of the "Oslo" project.

RESTful Web Services

REST is a very interesting architectural style for creating web services based on simple standards: plain HTTP, heavy use of URIs, and simple data formats like XML, JSON or Atom. RESTful services deliberately avoid the more complex WSDL/SOAP world and trust in the power of HTTP. This PDC showed that Microsoft itself makes heavy use of the REST idea in several areas.

Azure and Live services use REST in combination with the AtomPub format in order to hyperlink entities in a uniform and simple way - thus opening up possibilities for dynamic service clients.

The PDC session "WCF: Developing RESTful Services" by Steve Maine and his colleague Ron was one of the highlights of this PDC, because both speakers did a great job of outlining the basic REST ideas and motivating the use of this approach in combination with WCF. WCF itself has supported REST via its WebHttpBinding since version 3.5. But the WCF team has just released an add-on package during PDC in order to further simplify the creation of solid RESTful services with WCF. The speakers showed some great demos on how to

  • use attributes to route HTTP verbs to the correct method
  • use newly created exception types to report correct HTTP status codes in a very .NET-friendly way
  • create services that expose - and clients that consume - your data via JSON (for AJAX clients) or AtomPub (for smart clients)
  • use tools like Fiddler to inspect REST conversations
  • consume the metadata WCF offers for RESTful services via AtomPub

The add-on package should by now be available via http://msdn.microsoft.com/wcf/rest. I would also strongly recommend watching the video of this great session if you are interested in this topic.
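The dispatch idea the session demonstrated with WCF attributes - mapping HTTP verbs to handler methods and reporting proper status codes - is language-agnostic. A minimal sketch of the pattern (illustrative Python, not the WCF API; all names are made up):

```python
# Toy verb-based dispatch, mirroring what WCF's WebGet/WebInvoke
# attributes express declaratively. Purely illustrative.
class OrderResource:
    def __init__(self):
        self.orders = {1: "coffee"}

    def dispatch(self, verb, order_id, body=None):
        # Route the HTTP verb to the matching handler method.
        handler = getattr(self, verb.lower(), None)
        if handler is None:
            return 405, "Method Not Allowed"
        return handler(order_id, body)

    def get(self, order_id, _body):
        if order_id not in self.orders:
            return 404, "Not Found"   # a proper status code, not a leaked exception
        return 200, self.orders[order_id]

    def put(self, order_id, body):
        self.orders[order_id] = body
        return 200, body

resource = OrderResource()
print(resource.dispatch("GET", 1))      # (200, 'coffee')
print(resource.dispatch("DELETE", 1))   # no handler -> (405, 'Method Not Allowed')
```

WCF does the same mapping declaratively via attributes and URI templates; the sketch just shows the runtime effect.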

"Oslo": Building textual DSLs

Chris Anderson and Giovanni Della-Libera gave a demo-focused talk about "M Grammar" and its use for creating your own textual DSL. They showed what it takes to create a flexible textual DSL for telephone contacts like the following line:

contact: John Doe 2334-2345-222

The demo showed how to build up tokens, syntax trees, whitespace handling, recursion etc. Chris and Giovanni revealed the real power of "M Grammar" with several samples and created their contact language in an incremental fashion. Usage of the DSL still seems to be at a very early stage - features like LINQ and dynamic types will need to be implemented before a final release.
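The tokenization step behind such a DSL - split the line into a keyword, name words and a number token, with whitespace handling - can be sketched outside of M as well (illustrative Python; this is not M Grammar syntax):

```python
import re

# Illustrative tokenizer/parser for the contact line shown above.
# The ideas (tokens, whitespace rules) mirror what M Grammar lets
# you express declaratively; the regex here is just a stand-in.
CONTACT = re.compile(
    r"contact:\s+"                               # keyword token
    r"(?P<name>[A-Za-z]+(?:\s+[A-Za-z]+)*)\s+"   # one or more name words
    r"(?P<number>[\d-]+)$"                       # digits and dashes
)

def parse_contact(line):
    m = CONTACT.match(line.strip())
    if not m:
        raise ValueError("not a contact line: %r" % line)
    return m.group("name"), m.group("number")

print(parse_contact("contact: John Doe 2334-2345-222"))
# ('John Doe', '2334-2345-222')
```

M's added value over such ad-hoc parsing is that the grammar also yields a syntax tree and tooling support for free.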

The audience was very interested - many language gurus had deep questions regarding details of the new language and gave Microsoft lots of interesting ideas for the next months and years.

I'm very curious where this journey will lead development on the .NET platform - in my opinion it might have a big impact on how we develop in the future.

Summary

In my opinion "Oslo" was _the_ technical innovation at this PDC. I think the language and its tools will undergo heavy refactoring during the next 12 months - but it will be a big leap in the right direction.

Windows Azure and its services will help fast-growing companies to host their environment and avoid heavy investments in on-premise hardware. Azure and its Internet Service Bus are also a great opportunity to build supply-chain solutions or connect enterprises in a very elegant fashion - without sacrificing security investments: Azure simply federates between the participants' already-established security systems. Azure also allows event-driven and pub-sub architectures between enterprises behind firewalls - which was hard to establish before without opening security holes.

Windows 7 is again a new Windows OS - and doesn't seem to be a BIG release from the perspective of a software engineer. But it will fix certain common annoyances, like working in different networks. It will find its customers and cause fewer problems than Vista, because the Vista device driver model hasn't changed and Microsoft seems to have learnt some of the Vista lessons...

VSTS 2010 and .NET 4.0 will be very big releases. Microsoft is pushing heavily in many different areas. WF is strongly improved and used in many other products. The C# compiler will be heavily improved in v4. WPF and Silverlight are growing closer together and learning from each other. XAML is being further improved and is designed to be a solid basis for vast parts of the platform.

Last but not least: Microsoft apparently watches developments in the community and reacts quite fast (considering the size of the company) to new trends like dynamic languages, REST, cloud and parallel computing - without sacrificing investments of the past. This makes the .NET platform a solid basis for application and service development, and it will reach even higher maturity with v4.

Thursday, October 30, 2008

PDC2008 - Day 3

Microsoft Research Keynote

Day 3 of PDC was opened by Rick Rashid - head of 800 researchers at Microsoft Research. Rick gave some interesting insight into his life and the current work of one of the largest research organizations in the world. Especially the demos of the WorldWide Telescope and SecondLight - a new version of the multi-touch Surface computer - were fascinating. But they were all topped by Matt MacLaurin and his Boku project - a graphical programming environment for children. It is designed to enable kids to program impressive little games using a game controller. Details and screenshots can be found here.

Internet Service Bus

My next session was Clemens Vasters talking about Azure Services as an Internet Service Bus. Clemens did a great job in motivating the new cloud services and their technical details.

The central point is a service registry, which can be found at http://servicebus.windows.net/services/.

Services hosted on a local server can be registered on the web under this new domain. Service clients address their calls to the service bus directory but are instantly re-routed to your local service implementation. This happens in a very intelligent fashion, depending on the binding you select. Clemens has some excellent graphics in his slide deck illustrating the process in detail.

Azure services use the well-known WCF programming model, but are heavily based on web standards - thus enabling interop with Java, which Clemens promised to show in a session tomorrow...

The biggest advantage of the Azure service platform is the possibility to use it as a relay which establishes a direct, bidirectional connection between a client and a service despite firewalls and NAT. This is done via socket forwarding and port probing. Client and server only need to create outbound connections into the cloud in order to start communication. The rest of the job is done by the Azure fabric, which snaps the sockets to match each other and gets out of the way of the normal service communication. All of this can and should be done with message-level security. Authentication and authorization features are also provided, and their use is strongly recommended.

These features finally enable secure pub-sub solutions between enterprises. Clemens called it "pervasive, secure connectivity for services" and a "DMZ in the sky"...
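The core relay trick - both parties connect *outbound*, and the relay stitches the two connections together - can be sketched with plain sockets. This is a toy stand-in for the Azure fabric (no security, no port probing; purely illustrative):

```python
import socket
import threading

# Toy relay: "service" and "client" both connect outbound to the
# relay, which then pipes bytes between them in both directions -
# a crude sketch of the service bus rendezvous, nothing more.
def run_relay(listener):
    a, _ = listener.accept()   # first outbound connection (the service)
    b, _ = listener.accept()   # second outbound connection (the client)
    def pipe(src, dst):
        while True:
            data = src.recv(1024)
            if not data:
                dst.close()
                return
            dst.sendall(data)
    threading.Thread(target=pipe, args=(a, b), daemon=True).start()
    threading.Thread(target=pipe, args=(b, a), daemon=True).start()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(2)
port = listener.getsockname()[1]
threading.Thread(target=run_relay, args=(listener,), daemon=True).start()

service = socket.create_connection(("127.0.0.1", port))  # outbound only
client = socket.create_connection(("127.0.0.1", port))   # outbound only
client.sendall(b"hello service")
print(service.recv(1024))   # b'hello service' - no inbound port was opened
```

Neither endpoint ever accepts an inbound connection, which is exactly why this pattern works through firewalls and NAT.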

I recommend watching the video of this excellent presentation as soon as it is available online!

VSTS 2010 Architect Edition

My next topic was an interesting discussion with Peter Provost, Jeff Brown, Christian Binder and my colleague and VSTS specialist Klaus Liebe regarding the future of VSTS modeling, model merging and other very interesting aspects of VSTS project handling in general. It was great fun having direct influence on an important area of the tool. Peter's presentation later on showed the vast interest of many developers in the new modeling features of the VSTS 2010 Architect Edition. Now we have the newest CTP bits and can take a closer look at the newly created functionality.

Offline-enabled Data Services

Another very interesting session today was given by Pablo Castro, who introduced his project "Astoria Offline". The presentation showed some very interesting problems you face if you want to create an Outlook-like, occasionally connected system. The project team wants to create a solution that makes services available offline by using technologies like ADO.NET Data Services (again REST...), the Microsoft Sync Framework and the ADO.NET Entity Framework. A nice point is that the solution is composed of building blocks, so you can replace the individual technologies with ones of your choice as long as you meet certain criteria.

Work on this project is still at a very early stage, but the team is definitely taking the right direction to tackle these hard problems of modern application design.

PDC2008 - Day 2

Keynotes 2 & 3

The second PDC day was again introduced by Ray Ozzie. The main topic of the keynote was creating holistic solutions for PC, web and phones in order to create synergies and get the best out of the different devices. Ozzie and his colleagues introduced important new technologies as building blocks for this interesting vision on the Microsoft platform.

Demos of Windows 7, Azure Services, Live Mesh and the web version of Office 14 showed interesting scenarios of how this idea might look in the future.

Steve Sinofsky presented some fancy new features of Windows 7, most of them regarding the UI, like the new taskbar. But Windows 7 also offers some really nice convenience features we all would have needed since Windows XP:

A feature called "homegroup" distinguishes office and home environments for laptop users. This enables features like automatic switching of the default printer - depending on the network you are currently working in. But "homegroup" offers more: one nice feature is a mechanism to synchronize media and documents between all the connected home devices. Thus it is possible to play music stored on other computers in the "homegroup".

A feature called "libraries" enables the user to create a kind of logical folder, which can be used to present content from different physical folders at a common location in the Explorer. One use case is a music library for scrolling through music that is physically distributed across My Music, a USB drive folder and a network folder, but presented in one common "library".
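The idea - one merged logical view over several physical folders - is easy to sketch. This is obviously not the Windows 7 shell implementation, just an illustration of the concept:

```python
import os
import tempfile

# A "library" in the Windows 7 sense: a logical folder merging the
# contents of several physical folders into one sorted view.
# Purely illustrative - not the actual shell feature.
def library_view(folders):
    entries = []
    for folder in folders:
        for name in os.listdir(folder):
            entries.append((name, os.path.join(folder, name)))
    return sorted(entries)   # one merged, alphabetical view

# Demo with two throwaway "physical" folders:
a = tempfile.mkdtemp()
b = tempfile.mkdtemp()
open(os.path.join(a, "song1.mp3"), "w").close()
open(os.path.join(b, "song2.mp3"), "w").close()
print([name for name, _ in library_view([a, b])])  # ['song1.mp3', 'song2.mp3']
```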

Next was Scott Guthrie showing off various new technology. Most interesting to me was that VS 2010 will get a completely new WPF-based GUI built on the Managed Extensibility Framework. This enables powerful extensions, especially for the code editor. ScottGu showed a small demo in which he dynamically replaced the code documentation above C# methods within the new code editor. Instead of the plain old
/// comments
he showed a nice embedded WPF control which contained the comment text, with hyperlinks to work items referenced in the documentation text.

Another demo of Live Services and Live Mesh showed that Microsoft has also put a lot of effort into these services in order to share data between devices, again through synchronization features.

The BBC gave an impressive showcase demo of their iPlayer v2, which is based on Silverlight technology and enables media consumption "on demand" and sharing of playlists and favorites between you and your friends in a very comfortable way.

Office 14 finally contains a web edition. Demos of OneNote, Word and Excel showed that different users can work simultaneously on the same documents - regardless of whether they use the desktop or web version of Office. Changed parts of the document are highlighted in the other user's application with the name of the other user. Changes are synchronized in an asynchronous fashion.

The third keynote of this PDC was given by Chris Anderson and Don Box. The session was extremely code-centric and showed the RESTful API experience of addressing the new Windows Azure Services. Don and Chris are generally great presenters, but this time they lost their audience several times, because they didn't really motivate their quite entertaining show...

A modeling framework called "Oslo"

The remainder of my day was focused on the new modeling framework "Oslo". Oslo wasn't really mentioned in any of the keynotes, but might be one of the "big things" of the next years.

This new framework can be used to design textual and graphical domain specific languages. New tools code-named "IntelliPad" and "Quadrant" help developers design their own language and its graphical representation. The languages and their schemas are defined using a new structural language called "M", which looks similar to JSON. "M" is compiled into a relational representation by the M compiler. The result is stored in a database in order to enable powerful queries against the models.

Oslo is still at a pre-alpha stage - yet Microsoft is already starting to use it heavily to create its own DSLs for certain key areas. Prominent examples are MService, for very short definitions of service endpoints, and MEntity - a very compact form of expressing object-relational mappings for the Microsoft Entity Framework.

Tuesday, October 28, 2008

PDC2008 - Day 1

Keynote

The first PDC conference day was opened with a keynote by Ray Ozzie et al. revealing Windows Azure - the new MS cloud OS.
Windows Azure will serve as a third tier complementing the first (desktop & mobile clients) and second (enterprise servers) tiers. The presentation finally clarified Microsoft's Software+Services strategy: .NET developers will be enabled to enrich their applications with cloud services - either written by themselves and hosted in Microsoft's data centers, or drawn from Microsoft's Azure service offerings. Identity federation might be one of the most interesting services, leveraging enterprises' local Active Directory infrastructure as part of a claims-based, globally federated identity management system used "in the cloud". All in all the keynote was very focused on infrastructure aspects - thus it wasn't as thrilling to most attendees as other PDC keynotes in the past. Nevertheless it shows the big shift in Microsoft's business from product to service offerings - as I already expected in my post yesterday.

VSTS 2010

Cameron Skinner gave several nice demos of key scenarios VSTS 2010 is built to solve. We have heard about these "Rosario" features for some time now, but it was quite interesting to see them running live. The main focus of VSTS 2010 lies in testing and architecture capabilities. One of the coolest features is reproducing a bug found during a tester's session on the developer machine. The developer is supported by the bug work item with several attachments - screenshots and a video showing what the tester was doing and experiencing during the test session. The developer can jump into the video at every test step. Historical debugging information with call stack and context information is supplied - the developer can see, visually and code-wise, what was happening during the test session. This feature obviously still needs some tuning - but the chosen path is clearly visible.

I think architects will love VSTS 2010. It supports UML 2.1 diagrams and makes heavy use of modeling in several places. There are code-centric features like generating sequence diagrams from code - a great basis for re-engineering tasks. VSTS also supports model-centric design - e.g. by providing layer diagrams which formulate the intended dependency graph between your different software layers - and these can now be enforced via build strategies. As soon as a developer violates your architecture rules by referencing an assembly residing in a forbidden layer, you can let the build break. A great feature for pro-active quality management.
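The validation idea itself - declare the allowed layer dependencies, then fail the build on any reference that crosses a forbidden boundary - is simple to sketch independently of VSTS (layer names and references below are made up for the demo):

```python
# Sketch of the layer-validation idea behind VSTS 2010 layer diagrams:
# declare allowed dependencies, then report every actual reference
# that violates them. Not the VSTS implementation - just the concept.
ALLOWED = {
    "UI": {"Business"},
    "Business": {"Data"},
    "Data": set(),
}

def check_references(references):
    """references: iterable of (from_layer, to_layer) pairs."""
    violations = [(f, t) for f, t in references
                  if t not in ALLOWED.get(f, set())]
    return violations   # non-empty -> let the build break

refs = [("UI", "Business"), ("Business", "Data"), ("UI", "Data")]
print(check_references(refs))   # [('UI', 'Data')] - a forbidden shortcut
```

In VSTS the pairs would come from assembly references discovered at build time; the rule check itself is exactly this small.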

C# 4.0

Anders Hejlsberg was once again my personal highlight of the PDC day. He's one of the few speakers who are both very profound in their message and equally smart in their presentation technique and motivation of their topics. Anders spoke about multi-paradigm requirements for programming languages and how C# 4.0 will cope with modern aspects like dynamic typing, declarative programming and concurrent computing. Especially the demos of the new dynamic keyword in C# 4.0 impressed the audience. It simplifies interoperability with dynamic languages like Ruby or Python, but also tremendously improves COM interop scenarios. I strongly recommend watching the recorded video of this PDC session!
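What dynamic buys C# is member lookup at runtime instead of compile time - the duck typing that dynamic languages do natively. The effect can be illustrated in Python, where every call is late-bound anyway (all classes below are made up for the demo):

```python
# What C# 4.0's 'dynamic' keyword enables: member resolution at
# runtime. Python resolves every call this way natively, so this
# sketch just makes the duck typing explicit.
class ComObjectStub:
    """Stand-in for a late-bound COM object (illustrative only)."""
    def Quack(self):
        return "COM quack"

class Duck:
    def Quack(self):
        return "quack"

def make_it_quack(anything):
    # No static type required - the member is looked up when the
    # call happens, like 'dynamic d = ...; d.Quack();' in C# 4.0.
    return anything.Quack()

print(make_it_quack(Duck()))           # quack
print(make_it_quack(ComObjectStub()))  # COM quack
```

In C# the same late binding removes the need for reflection plumbing in COM interop and when calling into Ruby or Python objects.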

ASP.NET MVC

My next session was about ASP.NET MVC. I didn't learn anything tremendously new, but all my feelings about these bits were confirmed - this is cool stuff which has to be watched closely. It is built by a small team in an incremental fashion, with heavy community support and reviewing. ASP.NET MVC will be released at the end of 2008 as an ASP.NET add-on. All the interesting aspects of Rails are ported to the .NET world, leveraging the REST approach, DRY and convention over configuration. ASP.NET MVC is going in the right direction - the only problem is: forget about your well-known ASP.NET controls - this is a different world. Instead, Microsoft trusts heavily in jQuery - an open-source JavaScript library which now seems to become very important and can be used to implement some of the Web 2.0 glitter...

WF 4.0

This was my last session for today. It started quite high-level, but revealed some important news: my bad feelings about the current - and, in my opinion, overloaded and complex - WF design in .NET 3.0 and 3.5 were confirmed. The team has decided to completely rewrite the WF runtime for v4! It is now built on "Oslo" and the new modeling language "M", which will be revealed in greater detail in the keynote tomorrow morning.

Even the workflow designers now look completely different - they are built with WPF technology. Workflows can now be expressed either graphically or textually by a specific DSL. Custom activity design is claimed to be extremely simplified in comparison with previous WF versions. Performance of the runtime is said to be increased by a factor of 10 to 100 - depending on the workflow scenario. Let's see how these promises hold up in development reality...

Monday, October 27, 2008

PDC 2008 opened: "Think way outside the box!"

PDC 2008 was opened today with its pre-conference. The agenda looks pretty "cloudy" and even the posters show clouds - see below.

[Photo: P1010692]

What a shift for a company like Microsoft - the cloud and verbs like [scale], [interoperate] and [extend] might be more important than the announcement of Windows 7?!

I was attending the WPF pre-conference session held by Windows GUI guru Charles Petzold.

[Photo: P1010690]

Petzold gave the audience detailed and completely PowerPoint-free demonstrations of his experiences with the WPF object model and best practices in control and template design, talked a lot about dependency properties and showed the enormous power of XAML scripting. It was quite fun watching him demonstrate the basic concepts using his XAML Cruncher - even if he didn't mention any groundbreaking new stuff for WPF experts...

Thursday, July 31, 2008

REST versus WS* - a wonderful parable!

Have a look at this wonderful parable. There is much truth in it...

Thursday, June 26, 2008

Software Factories in practice...

My (German) talk at the "Microsoft Launch 2008" about my experiences with software factories during the last years has been published as a video by Microsoft. You will find it here.

Finally arRESTed...

It took me some time to understand the real benefits of RESTful architectures, but a combination of good books about Rails and RESTful Web Services, current project issues and this podcast filled the final gap...

The idea of resource-centered, URI-based design alone brings real power to many kinds of applications. I'm currently considering the possibility of addressing every important entity in our system by a simple URI - and my project mates seem to be pleased by this idea...
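The scheme itself is trivial, which is exactly its appeal - every entity gets a stable, guessable address. A sketch (the base URI and collection names are hypothetical):

```python
# Resource-centered addressing: every important entity gets a simple,
# stable URI derived from its collection and id. Names are made up.
BASE = "https://example.org"

def entity_uri(collection, entity_id):
    return "%s/%s/%d" % (BASE, collection, entity_id)

print(entity_uri("orders", 42))     # https://example.org/orders/42
print(entity_uri("customers", 7))   # https://example.org/customers/7
```

Once every entity has such a URI, plain HTTP verbs against those URIs replace a whole class of bespoke service operations.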

PDC 2008 - see you in L.A.

I just registered for PDC 2008... This will be my second visit to this great conference, after 5 years of waiting for and finally using the technologies Microsoft promised at PDC 2003.

Last time they told us about Windows "Longhorn" and technologies like Avalon or Indigo. Vista, WPF and WCF became reality; WinFS and ObjectSpaces didn't make it ;-)

It was great fun to see guys like Don Box or Aaron Skonnard in several sessions and to discuss the inheritance mechanisms of C# attributes with Anders Hejlsberg.

This time we'll hear a lot about services in "the cloud" - it will be fun!

Sunday, February 03, 2008

Ruby on Rails was the hot topic @ OOP 2008

Rails was one of the hot topics at the OOP 2008 conference in Munich. It was amazing: .NET and Java guys alike were fascinated by the different success stories about Rails and its "convention over configuration" paradigm.

Especially Dan North from ThoughtWorks gave an impressive talk (without PowerPoint slides) about Rails being ready for the enterprise - or vice versa. He didn't forget to point out critical aspects of "hype technologies" like Rails for long-term applications. What will happen to this kind of framework in two years, when famous front-runners like the pragmatic programmers Andy and Dave have discovered more interesting technology?

This interesting aspect was pushed further in a follow-up discussion I had with one of our Ruby guys at Zühlke: Vassilis pointed out that several new Ruby frameworks are currently being created to cope with the faults built into Rails...

In the meantime, convention over configuration has also reached the "big players". It will be very interesting to watch how the success of Rails affects future versions of .NET and JEE...

Microsoft's DLR will be the first step in that direction.

Wednesday, April 18, 2007

Skills of a Chief Software Architect

I posted my opinion on this topic at the SEI site of Carnegie Mellon University. Paul Clements himself asked the IASA community which duties, skills, knowledge and organizational support software architects should have.

Sunday, January 22, 2006

WS-Security with X.509 certificates

Over the last few weeks I have tried several different tools to create reasonable X.509 certificates for WS-Security scenarios. There are a lot of tools. You could use
  • MakeCert (Microsoft)
  • Keytool (Sun)
  • CertGen (Bea)
  • OpenSSL
  • and there are others...
My first and biggest mistake was trying to use the same certificate (and its private key) for different server-side technologies.
The problem is that the different technologies use completely different key stores (Java Key Stores (JKS) vs. the Microsoft certificate store). This means you have to export certificates and their private keys from one certificate store and import them elsewhere. This sounds quite simple, but it is not, given the fact that there are several different formats for serialising X.509 certificates into files. Microsoft prefers PKCS#12 and uses *.pfx files if you export certificates from the Windows certificate store. It's not that simple to get those files imported into a Java key store. The easiest way we found to get them into a JKS is the PKCS12Import utility in Sun's JWSDP 1.6.
You can do this and it works - but: your test certificates do look like test certificates. Professional certificates for use in production - as they are issued by Verisign, e-Trust or others - look quite different. They contain lots of information in the subject entries and have a trust chain back to the trusted root certificate.
So I preferred a completely different way to be close to production reality: I ordered SSL test certificates from Verisign and used them in our demos. These certificates are usually bound to your web server, and this makes for a perfect test environment:
  1. Use your computer's name as subject name in the certificate
  2. Set up the test certificate in a test web site on your machine, which is secured by SSL
  3. Test whether you can reach the site via https
  4. Now you can be sure your certificate is installed properly and is trusted
  5. Use this certificate in WS-Security scenarios.
Important: repeat these steps for each of your server environments. Different web servers use slightly different certificates (see the Verisign sites - they want to know what your environment is). The odds are good that your certificate will work fine in WS-Security scenarios if it works for SSL in the same technology stack...
A last important note: if you want to use these certificates in web services that are secured via SSL and transport-layer security, be careful to use the correct URL on the client side: while http://localhost/MyService.asmx might work, the same might not be true for https. The certificate wasn't issued for a machine called localhost, so calls from the client will be rejected. You must use your machine's name in the URL; in my case this worked fine:
https://fran0111/MyService.asmx, because fran0111 is my machine's name and the subject name in the test certificate I used in the service's security policy.
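The localhost failure is just hostname matching: the client compares the URL's host with the certificate's subject name. A simplified sketch of that check (real TLS implementations also consider subjectAltName entries and wildcards):

```python
from urllib.parse import urlparse

# Simplified version of the check a TLS client performs: the URL host
# must match the certificate's subject common name. Real matching is
# richer (subjectAltName, wildcards), but the principle is this.
def hostname_matches(url, cert_subject_cn):
    host = urlparse(url).hostname
    return host is not None and host.lower() == cert_subject_cn.lower()

cert_cn = "fran0111"   # subject name from the test certificate above
print(hostname_matches("https://fran0111/MyService.asmx", cert_cn))   # True
print(hostname_matches("https://localhost/MyService.asmx", cert_cn))  # False
```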
I hope this post spares you some of the mistakes I made using certificates. It all sounds quite easy - but I'm afraid it isn't. Not in interoperability scenarios...

Monday, January 16, 2006

Heading for Munich

I'm just on my way to Munich, where I'm giving a talk about web service policies and interoperability at OOP 2006 together with Klaus Alfert. It's gonna be fun. We're giving a high-level overview of WS-* standards and drilling down into some interop scenarios between .NET 2.0/WSE 3, BEA WebLogic Workshop 8.1.5 and Sun's JWSDP 1.6.
Preparing this talk and one of the security demos, we both learnt a lot about getting, storing and using X.509 test certificates and about the different tools for managing them. It's a kind of nightmare if you try to get the same certificate working on different server platforms. I'll post some more details on my experiences in this area next week...

Friday, December 02, 2005

GAT4WS

Last week at Microsoft's Architects Forum in Munich, Beat Schwegler, architect at Microsoft EMEA in London, showed an awesome demo of an internally developed guidance package for web service development named GAT4WS.
It was by far the most convincing demo of GAT I have seen so far. The recipes make use of Christian Weyer's design tool WSContractFirst and promote an interesting solution organisation with several logical layers. The main goal is to abstract the transport and web service aspects away from your business logic. It's a nice piece of automated guidance.
The only problem: it hasn't been officially released yet...
(I think they are waiting for the next GAT release.)

Wednesday, October 12, 2005

Guidance Automation Toolkit

I just finished my first experiments with Microsoft's Guidance Automation Toolkit, a.k.a. GAT. This toolkit is an interesting way for architects or chief programmers to offer guidance and best practices to their team - not just as written documentation, as can be found on Microsoft's patterns & practices site. Using GAT you can create your own guidance packages containing a set of executable recipes for repetitive design and programming tasks in Visual Studio 2005.
GAT recipes are built to appear in a strictly context-sensitive fashion. They offer their functionality just in those situations where they should be used - that's the main idea behind the concept. Recipes can be invoked via context menus or from the task pane and work in several steps. First they collect the input data needed for their task. This might happen either with the help of extensible wizard dialogs or without user interaction at all (through your code instead). With the required input arguments in place, one or more actions are invoked. These actions might manipulate your current project settings or - more typically - add some pieces of code to your project. Code injection is done through a source file template mechanism. Recipes are defined declaratively as XML templates.
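The recipe pipeline described above - collect input arguments, invoke an action, inject templated code into the project - can be sketched in a few lines. This is emphatically not the GAT object model, just the shape of the idea (all names are invented):

```python
from string import Template

# Toy "recipe" pipeline mirroring the steps described above: collect
# input arguments, run an action that renders a source template, and
# add the result to the project. Purely illustrative - not GAT.
CODE_TEMPLATE = Template(
    "public partial class ${ClassName}\n"
    "{\n"
    "    // generated by recipe '${RecipeName}'\n"
    "}\n"
)

def run_recipe(name, arguments, project_files):
    # Step 1: "wizard" input has already been collected in `arguments`.
    # Step 2: the action renders the template with those arguments...
    source = CODE_TEMPLATE.substitute(RecipeName=name, **arguments)
    # Step 3: ...and adds the generated file to the project.
    file_name = arguments["ClassName"] + ".cs"
    project_files[file_name] = source
    return file_name

project = {}
created = run_recipe("AddServiceClass", {"ClassName": "OrderService"}, project)
print(created)   # OrderService.cs
```

GAT adds the context sensitivity and Visual Studio integration on top; the core generation step is this simple template substitution.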
GAT builds on GAX (the Guidance Automation eXtensions), which extends the possibilities of the Visual Studio API. I tried the latest technology preview from May 2005, which works with Visual Studio 2005 Beta 2 and can be downloaded here. It all works quite nicely. My only major problem was editing the code templates. The VS editor isn't yet familiar with the template markup and tries to reformat your code in situations where you don't appreciate it...
Check out the community site and its hands-on lab from TechEd 2005, which uses GAT to automate certain data access tasks and explains the main concepts.
In my opinion this is another step in the right direction - building the foundations for Software Factories.

Saturday, September 17, 2005

Keep your models in sync!

During my last 3 years working for Zühlke as a consultant and software architect, I've learnt to appreciate modelling in different projects. To be honest: there are several situations in which you'd like to express your thoughts in a way your tool doesn't support. But if you are really lucky and have (one of the few) usable modelling tools, your whole team will gain architectural overviews that are hard (or maybe impossible) to provide without graphical representations and abstractions of your work.

Models are perfect for sketching and documentation - this is a well-known fact - but they can do more for you. Typically, models provide structural and behavioural views of your project as a whole. Rational defines several views in its Rational Unified Process (logical, process, component and deployment views) - you might need others for your specific project needs. One of the most important issues only few people think and talk about is model integration. Models tend to grow, especially if they're used efficiently, and every team member uses parts of the model for his or her daily work. Sooner or later you need to split your model into several parts. Or you distinguish analysis and design models: first you create some high-level analysis classes or components and add some typical sequences and interactions to get a better understanding of your target domain.
Later you drill down into some technical issues in your design model. You literally "dive into" parts of your analysis model. You see errors and change your design because of technical restrictions, optimizations, or simply to correct mistakes you made during high-level analysis, as you recognize new details about the problem at hand.
This is one of the most critical moments in iterative software design: most teams forget to update the good old analysis model they used throughout the initial project iterations. The architectural overview in your analysis model will become unclear - and, as time passes by, simply false.
You need to attack this problem during the early days of your project. Don't divide your team into those using the analysis model and others using the design model and code. Don't create roles for architects who never see the design model or code. These guys are known as "PowerPoint architects" - living in their own world. Beware of them. You certainly need team members who are responsible for the overall architectural issues - usually both technically focussed and business-focussed architects. But keep them in touch with your daily development issues. You need them to regularly extract important design information back into their analysis model. The modern idiom for this extraction is "harvesting" (see IBM/Rational Software Modeler).
Most modelling tools aren't capable of automating this important harvesting step. And it is no simple task: how should a tool decide which design aspects are architecturally relevant and which are not? Sure - you can use stereotypes or something similar to annotate certain classes' relevance. But it is more difficult for the behavioural aspects of your design. Imagine your high-level analysis model showed some business-critical component and service interactions. What should happen if you change certain design aspects of this interaction down in your design model? Could any tool tell with some certainty whether your overview is affected by those changes? Can we define these rules? Do we have patterns for these decisions? I think we need much more experience in model integration, updating and cross-model referencing before we will be able to automate those tasks…

Pragmatic Programming - the way towards your own DSL

I myself am a passionate coder and usually prefer lightweight tools. It’s always fun to read books from the Pragmatic Bookshelf (see www.pragmaticprogrammer.com). Dave Thomas and Andy Hunt really hit the mark with most of their opinions, which are drawn from real-life projects. Good programming skills are the most important foundation if you want to develop and engineer larger software projects.
You want to automate certain aspects of your design and development process, as proposed by the Software Factories idea? Well, first you need to know how to accomplish these tasks manually in an elegant way. I think it’s an important prerequisite that nobody talks about…
Remember these steps:
  1. Learn to program
  2. Learn to design
  3. Learn to model
  4. Learn to automate
  5. Design your automation through DSLs
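A gentle first step toward step 5 is an internal (embedded) DSL: a small fluent API in your host language that reads like a sentence in the problem domain. A minimal sketch in Java; the `Pipeline` class and all names here are purely illustrative, not taken from any tool mentioned above:

```java
// A tiny internal DSL for describing a build pipeline (illustrative only).
public class Pipeline {
    private final StringBuilder plan = new StringBuilder();

    // Entry point of the DSL: starts a new pipeline description.
    public static Pipeline define(String name) {
        Pipeline p = new Pipeline();
        p.plan.append("pipeline ").append(name);
        return p;
    }

    // Each step returns 'this', so calls chain fluently.
    public Pipeline step(String action) {
        plan.append(" -> ").append(action);
        return this;
    }

    public String describe() {
        return plan.toString();
    }

    public static void main(String[] args) {
        String plan = Pipeline.define("release")
                .step("compile")
                .step("test")
                .step("package")
                .describe();
        System.out.println(plan);
    }
}
```

The chained calls form the “sentences” of the language; once such an internal DSL feels right, you know what an external DSL for the same domain would have to express.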

Saturday, September 03, 2005

Software Factories

I've finally made it - I'm through more than 600 pages of this interesting book by Jack Greenfield and Keith Short. And it was worthwhile taking the time to understand the motivation behind (Microsoft's) Software Factory initiative.
The main point behind the term Software Factories is to use modern software development environments to industrialize software development for product lines. Software Factories combine modelling with Domain Specific Languages, patterns, frameworks, guidance, and code generation to automate away tedious, repetitive (!), and error-prone hand-crafted code. The book describes the theory and ideas behind Software Factories, while Microsoft tries to deliver the proof of concept: the upcoming versions of Visual Studio Team System will contain numerous tools that give you the opportunity to create and visualize your own domain specific languages, synchronize models through transformations, and automate and enforce your software development process through the new Guidance Automation Toolkit.
I’ll use my next blog entries to investigate some of the key points touched on by Software Factories and try to illustrate them with issues from my own practical experience.

Wednesday, August 31, 2005

SOA versus component design

Scott Bellware published a very interesting article on theserverside.net. Driven by several discussions and presentations at Microsoft’s TechEd in Orlando in June, he discusses today’s difficulties in application architecture, which drifts between modern SOA ideas and the RAD tooling reality. I share many of his critical thoughts.
Let’s look at an example: Visual Studio .NET is a really powerful development environment with many productivity features, but it doesn’t guide developers towards solid design decisions.
Many of the decoupling lessons learnt in the good old COM days seem to be lost in modern .NET development. Many .NET developers aren’t even aware of the possibilities and responsibilities that lie in using .NET assemblies as a modern component model. It was never as easy as it is today to create component-based applications, using interfaces to decouple implementations and following best practices from Test Driven Development (TDD).
It was so much harder to create the very same effect using COM interfaces, C++, and ATL in the late 90s. The benefit was that, though we had to design interfaces in IDL and express shared data types in strange type systems (beware of VARIANT and BSTR), we knew the component boundaries! These boundaries were so extremely and painfully visible that they became one of the main reasons why Microsoft created .NET. But now that those problems are gone, nobody seems to be interested in components any more. Everybody talks about services.
I think most designers and application architects should take a step back: design your applications using strictly separated components. Use small and distinct interfaces to establish controlled communication between these components. Define and model an application-specific layered architecture that goes beyond the physical deployment tiers found in many articles. The result will be an application that consists of many components that you can develop and test concurrently, and understand and maintain more easily.
These applications and their components will be first-class citizens in the SOA future of your enterprise, because it will be fairly easy to create coarse-grained service interfaces on top of these well-defined components.
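The separation described above can be sketched in a few lines. Java is used here for brevity; the same idea applies to .NET assemblies, and all class and method names are illustrative, not from any real product:

```java
import java.util.Arrays;
import java.util.List;

// Small, distinct component interface: the only way into the component.
interface InventoryComponent {
    int stockLevel(String sku);
}

// Concrete implementation hidden behind the interface; easy to swap out in tests.
class InMemoryInventory implements InventoryComponent {
    public int stockLevel(String sku) {
        return "WIDGET".equals(sku) ? 42 : 0;
    }
}

// Coarse-grained service facade on top of fine-grained components:
// one call answers a whole business question, suitable for remote exposure.
class OrderService {
    private final InventoryComponent inventory;

    OrderService(InventoryComponent inventory) {
        this.inventory = inventory;
    }

    public boolean canFulfill(List<String> skus) {
        for (String sku : skus) {
            if (inventory.stockLevel(sku) <= 0) return false;
        }
        return true;
    }
}

public class Demo {
    public static void main(String[] args) {
        // The service depends only on the interface, never on the implementation.
        OrderService service = new OrderService(new InMemoryInventory());
        System.out.println(service.canFulfill(Arrays.asList("WIDGET")));
    }
}
```

Note that `OrderService` never names `InMemoryInventory`: the component boundary stays visible in the code, and exposing `canFulfill` as a remote service later does not disturb the components underneath.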
What I want to say is this: SOA will not completely change the world of a component developer. It’s just the next evolutionary step, using well-designed components in a bigger scenario. Don’t forget the lessons learnt from the days when component design was the modern art. SOA won’t change those principles, but it will change the remote access to your components.

Risks behind the SOA hype

Hi all,
I just published my first critical article about the SOA hype in the German IT magazine "Computerwoche", which can be found here - caution: it's written in German.
I thought this event should finally be the trigger to open my own blog and share my personal thoughts on architecture, model-driven development, and software factories with the rest of the world. Hopefully somebody out there is interested...