Saturday, November 27, 2004

Hire The Right People

For Joel Spolsky, the number one criterion for getting hired at Fog Creek is being Smart and Getting Things Done.


(BTW, why does blogger.com's image upload only support JPEGs? I am using "Picasa Hello".)

It's hard to find smart doers, but please, keep searching for them. If you have the slightest doubt about whether a potential employee fulfills these criteria, don't hire.

The most dangerous species are those who are smart but don't get things done. First of all, they are harder to identify as such during the recruitment process. Checking their project portfolio helps (still, they are usually smart enough to fake it).

As opposed to not-so-smart doers, who cause operational mistakes (bad enough), smart no-doers are in a position to make strategic mistakes (even worse). Those are the kind of people who can talk their management into pursuing doomed endeavors that will never result in any marketable product, sometimes putting the whole company at risk. Combined with weak social skills and put in charge, smart no-doers are the best way to get rid of your last smart doers.

About O/R Mappers

We all like object-oriented programming, right? SQL, well... it's cool to have SQL code generated. And your domain object model, too. Some Not-Invented-Here experts will even propose to build a home-brewed O/R mapper. After all, they can do a better job than those Hibernate guys, can't they?

I know O/R mappers look tempting, and even more tempting is to build one of your own, at least to some architecture astronauts. Things might look promising until... people actually start applying it. But at that point, the investment has been made. No way back. And more than once, the developers who built the O/R mapper are not the same ones who actually use it, and the blame game is about to begin.

I have seen applications stall, projects fail, people quit, and companies go south thanks to the overly optimistic use of home-made, unproven O/R mappers. Building an O/R mapper is a long and painful road. You must avoid unnecessary database roundtrips, provide the right caching strategies, avoid limiting the developer ("but we were able to do subselects in SQL"), optimize SQL, support scalability, build a graphical mapping tool, and so on. Some people designing SOA / message-oriented systems just DON'T WANT their database model floating around within the whole application, but that's what is most likely going to happen.

Will it support all kinds of legacy systems, including some bizarre / non-normalized database designs? And yes, sometimes your programmers don't know the consequences of a virtual proxy being expanded. Only database profiling will show what goes on under the hood. "One join" is certainly preferable to "N selects", but that is probably not what you will get once you traverse a 1:N relationship. Will your caching algorithm still work in concurrent/distributed scenarios? And what about reporting? Your report engine might require plain old resultsets, not persistence objects.
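To make the "N selects" trap concrete, here is a minimal sketch of lazy loading gone wrong across a 1:N relationship, plus the eager fetch join that avoids it. It assumes Hibernate 2.x; Customer and its orders collection are hypothetical mapped classes.

    import java.util.Iterator;
    import java.util.List;
    import net.sf.hibernate.HibernateException;
    import net.sf.hibernate.Session;

    public class NPlusOneDemo {
        static int countOrdersNaively(Session session) throws HibernateException {
            // One SELECT for the customers ...
            List customers = session.find("from Customer");
            int total = 0;
            for (Iterator it = customers.iterator(); it.hasNext();) {
                Customer c = (Customer) it.next();
                // ... plus N more SELECTs, one per customer, as each lazy
                // collection proxy is expanded behind your back.
                total += c.getOrders().size();
            }
            return total;
        }

        static List fetchWithOneJoin(Session session) throws HibernateException {
            // The "one join" alternative: an eager fetch join loads customers
            // and their orders in a single SQL statement.
            return session.find("from Customer c left join fetch c.orders");
        }
    }

Only a SQL trace will tell you which of the two your innocent-looking object traversal actually produces.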

All those benefits the architects expected just don't turn out that way. "Too much magic", as one of our consultants put it. There must be a reason why accessing relational databases is done in SQL. It's just coherent. There is no OO equivalent that fits. There is no silver bullet.

Now, there are shades of grey, just as there are application scenarios where O/R mappers do make sense. I recommend considering an O/R mapper if

(1) You have full control over the database design (no legacy database to support).
(2) Load and concurrency tests prove that the O/R mapper works in a production scenario.
(3) Your customer favors development speed over future adaptability.
(4) The O/R mapper supports native SQL execution (or a similar kind of query language); see the sketch after this list.
(5) The O/R mapper is a proven product, and not the pet project of an in-house architect.
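For point (4), here is what such an escape hatch can look like, sketched against Hibernate 2.x's native SQL query API; the Customer class and CUSTOMER table are made up for illustration.

    import java.sql.Connection;
    import java.util.List;
    import net.sf.hibernate.HibernateException;
    import net.sf.hibernate.Session;

    public class SqlEscapeHatch {
        static List customersInRegion(Session session, String region)
                throws HibernateException {
            // Native SQL through the mapper: the {c.*} placeholder lets
            // Hibernate inject the mapped column list for Customer.
            return session
                .createSQLQuery("SELECT {c.*} FROM CUSTOMER c WHERE c.REGION = ?",
                                "c", Customer.class)
                .setString(0, region)
                .list();
        }

        static Connection rawJdbc(Session session) throws HibernateException {
            // When even that is too much magic (reporting, bulk updates),
            // drop down to the JDBC connection the session wraps.
            return session.connection();
        }
    }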

It is also important to distinguish standalone O/R mappers from container-controlled persistence (the latter is designed to run on a middle tier). J2EE Container-Managed Persistence Entity Beans do make sense in a couple of scenarios (and then again they don't make sense in a lot of others, and J2EE architects will rarely recommend a 100% CMP EJB approach). And of course, Hibernate and others do a pretty good job on N-tier systems as well.
Summing up, I strongly agree with Clemens Vasters in most cases. He states:

I claim that the benefits of explicit mapping exceed those of automatic O/R mapping by far. There's more to code in the beginning, that's pretty much all that speaks for O/R. I've wasted 1 1/2 years on an O/R mapping infrastructure that did everything from clever data retrieval to smartt [sic] caching and we always came back to the simple fact that "just code the damn thing" yields far superior, more manageable and maintainable results.
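To show what "just code the damn thing" amounts to, here is a minimal hand-coded mapping sketch in plain JDBC (pre-generics style, as was current at the time); the Customer class and table are again hypothetical. More typing up front, but every SQL statement is explicit, visible, and easy to profile.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    public class CustomerDao {
        private final Connection connection;

        public CustomerDao(Connection connection) {
            this.connection = connection;
        }

        public List findByRegion(String region) throws SQLException {
            PreparedStatement stmt = connection.prepareStatement(
                "SELECT ID, NAME, REGION FROM CUSTOMER WHERE REGION = ?");
            try {
                stmt.setString(1, region);
                ResultSet rs = stmt.executeQuery();
                List result = new ArrayList();
                while (rs.next()) {
                    // The mapping is spelled out; no proxies, no hidden SELECTs.
                    result.add(new Customer(rs.getLong("ID"),
                        rs.getString("NAME"), rs.getString("REGION")));
                }
                return result;
            } finally {
                stmt.close();
            }
        }
    }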

Tuesday, November 16, 2004

Life After Microsoft

Tomorrow, Wednesday, November 17th, at 7:00pm, the Upper-Austrian Workers Chamber will show "Life After Microsoft", a German TV documentary about former Microsoft employees who suffer from severe burnout syndrome.

Afterwards there will be a panel discussion. One panel member is the movie's director, Regina Schilling.

I have seen "Life After Microsoft" before. The Microsoft working ethics are quite demanding. Past achievements of long-time Microsofties don't count that much. You got to prove your commitment each day again.

On one hand, I would like to experience such working conditions. It must be very stimulating. On the other hand, it also sounds a little scary. I remember one of those ex-Microsofties saying "I turned into a vegetable".

WSDL Binding Styles

IBM provides a good introductory document on the different WSDL binding styles. The most common are RPC/encoded and Document/literal. The WS-I Basic Profile also recommends Document/literal, which seems to be becoming the broadly accepted standard. This binding style should actually be sufficient in most cases, plus it makes it possible to validate SOAP messages against their XSD schemas.
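As a small illustration of that validation benefit, here is a sketch that checks a document/literal payload against its schema, assuming Java 5's javax.xml.validation API; the file names are hypothetical.

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    public class PayloadCheck {
        public static void main(String[] args) throws Exception {
            SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            // order.xsd declares the message payload element.
            Schema schema = factory.newSchema(new File("order.xsd"));
            Validator validator = schema.newValidator();
            // Validate the payload extracted from the SOAP body.
            validator.validate(new StreamSource(new File("order-payload.xml")));
            System.out.println("Payload is schema-valid.");
        }
    }

With RPC-style bindings, the body is assembled from per-operation wrapper elements that are not declared in the schema, so there is no single element to validate the message against.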

Well, there are always people who know better, e.g. certain government agencies that provide webservices and somehow decided they had to use the completely uncommon RPC/literal binding style. This means that

(a) .NET 1.0 / 1.1 clients cannot access their webservice, as the .NET framework's webservice implementation does not support RPC/literal. Admittedly, there is a workaround, but that's definitely not something for the average programmer who drags and drops the webservice's reference into Visual Studio.
(b) SOAP messages are cluttered with unnecessary type encoding information on each request/response.

Of course those are the same folks that publish hand-coded (hence erroneous) WSDL files.

Thursday, November 11, 2004

GETFIREFOX.COM

Recognize me by looking for someone wearing one of those t-shirts.

Monday, November 08, 2004

Journey To The Past (14): Enterprise Applications (2002-today)

By the end of 2002 I had returned to the multi-tier world, working on various enterprise application projects, mainly under J2EE and .NET (including .NET Enterprise Services). I split my time among consulting, project management, and programming.

Sunday, November 07, 2004

Journey To The Past (13): Wireless Systems (2001-2002)

Programming mobile phones was a real adventure. The segmented memory model of the 16-bit architecture imposed the same 8086 / 80286 constraints as ten years before. Beyond the restricted system resources (which earned me valuable know-how, even for my current work back in the client/server and multi-tier world), the development and debugging environments for embedded devices are something that takes getting used to. I was involved in several customer projects, mainly implementing man-machine interfaces and sometimes even low-level device drivers (e.g. for the Samsung SGH-A500 and Asus J100 phones).



In the old days, common practice was to rewrite whole applications from scratch for each new phone, depending on the underlying device drivers. We tried to improve that ponderous approach by building a C++ framework for mobile phone applications, which would encapsulate device specifics and provide a feature-rich API.



As one of the senior programmers, I was in charge of laying some of the framework's groundwork (GUI, non-preemptive scheduler, API design, and similar topics), and I also wrote several tools that completed our developer workbench, e.g. a phone emulation environment for Windows, graphic and font conversion programs, and a language resource editor. I also managed a Java 2 Micro Edition port project.

Saturday, November 06, 2004

Journey To The Past (12): Generic and Dynamic Hypertexts (2000-2001)

The topic of my second thesis was the implementation of a tool for generic hypertext creation in Java. The resulting system was cleanly designed from beginning to end, and still satisfies my somewhat higher standards of today. I developed the editor (an MVC approach using observer mechanisms for loose coupling), including features like HTML content editing for graph nodes, an automation engine for building concrete hypertexts from generic templates, and a hypertext runtime implemented using Java Servlets.


Friday, November 05, 2004

Journey To The Past (11): PC Banking (1999-2001)

Our department was also developing and maintaining a PC banking system. For several years, the frontend used to be a 16-bit Windows / MFC application. I led a developer team that ported the old client to Java, utilizing Swing, JDBC (Sybase SQLAnywhere), and an in-house application framework.



Besides the complex business logic and the need for an RDBMS-to-OOP data mapping, realizing a powerful graphical user interface in Java was the most difficult challenge. We had to build several of the more complex controls on our own, like data grids and navigation panels. The client runs under Windows as well as under Linux and MacOS.

The PC banking application is mainly aimed at business customers. It reached an installation base of about 5,000 companies at the end of 2002, and another 50,000 companies were expected to upgrade within the following 12 months.

Journey To The Past (10): Visual Chat (1998)

My computer science curriculum included an assignment for a medium-scale project. I decided to build a chat program that would let the user move around in a 3D world and meet people at different locations. For several reasons I abandoned the idea of using standard protocols like IRC or VRML. Instead, I built the complete solution (server and client) in Java, employing my own communication protocol and 3D engine. The client actually runs inside any internet browser, without the need to install any additional software. This was a very valuable experience, covering issues like threading, synchronization, and networking. It was my first pure OOP project (disregarding the previous Smalltalk experience), and I remember it as a great field for experimenting, without tight schedules or sealed specifications. Of course, when I look at the old code today, I can clearly see that I was still lacking some experience back then.



Anyway, Visual Chat's user community grew steadily to more than 150,000 by the end of 2002. There are dozens of other Visual Chat server installations on the internet today (it's freeware).

Thursday, November 04, 2004

Journey To The Past (9): Internet Banking and Brokering (1997-2001)

Internet banking was still in its infancy at the beginning of 1997. We designed and developed the internet banking and brokering solution for one of the largest Austrian bank groups. Our system consisted of a Java applet as the frontend, business logic running on Sun Solaris servers, and integration with bank legacy systems like Oracle databases on Digital VAX and CICS transactions on IBM mainframes. Later on, I was project lead for porting our solution to several other regional banks.





Over the years we replaced the proprietary middleware with a Java 2 Enterprise Edition application server and kept session state on the server, which allowed for stateless clients. Java Server Pages would now produce HTML for client browsers and WML for WAP mobile phones.



By now, 500,000 people have subscribed to the internet banking and brokering service, accounting for an average of 100,000 logins per day.

Besides that, I was also responsible for the implementation of an internet ticketing service, including online payment.

Journey To The Past (8): Database Application Programming (1993-1996)

4th generation tools were especially en vogue in those days, and heavily applied at university, mainly for their prototyping capabilities. I developed a 4th Dimension application on the Apple Macintosh, which let university researchers enter survey data about companies' information technology infrastructure, calculated statistical indices based on alterable formulas, and finally produced reports with all kinds of pie and bar charts. I also employed 4th Dimension for my business informatics diploma thesis, when I implemented an information system for university institutes (managing employee, student, and course data, plus automated course enrollment).



We used SQL Windows to build a prototype of a bookstore database application under Windows 3.1, which came with a nice multiple document interface. Another prototype was done for a room resource planning system at university. And I did several freelance projects, mainly in MS Access, e.g. a customer relationship management and billing system for a local media company. We installed this application in a Windows for Workgroups / LAN Manager multi-user environment. It is still in use today.





My Access knowledge would also help me later on, during military service, when I got to spend two out of eight months inside a warm office (while my comrades were being drilled outside in the cold Austrian winter), implementing a database solution for the Airforce Outpatient Department.

Wednesday, November 03, 2004

Journey To The Past (7): Object Oriented Programming (1992)

Getting to know object-oriented programming was a major turning point. The new paradigm was overwhelming, and Smalltalk really enforced pure object orientation. I spent long hacking nights at the university lab getting to know the VisualWorks Smalltalk framework on an Apple Macintosh II, and somehow managed to hand in the final project on time: a graphical calendar application.

Tuesday, November 02, 2004

Journey To The Past (6): IBM 3090 Mainframe (1991)

Mainframe programming under MVS and PL/1 was exciting at first; it felt like stepping into the "big world" of business application development. Everyday experience turned out to be less fun in the end, when batch jobs were slowed down by my senior university colleagues, who used to play multi-user dungeon games, and each program printout meant waiting until the next morning to finally receive it.

Monday, November 01, 2004

Journey To The Past (5): The Age Of Atari (1988-1992)

I really fell in love with my first Atari. It was a 520ST, equipped with 512KB RAM (upgraded to 1MB for a tremendous amount of money as soon as I received payment from a summer job). GFA Basic was a mighty language. This was my introduction to GUI programming (Digital Research's GEM, an early Macintosh look-alike). One could invoke inline assembler code, so I bought a book about 68K assembler, and finally managed to run some performance-critical stuff in native mode.



But I also felt the lack of support for modules and data encapsulation in Basic, so I decided to learn C (sounds like a contradiction today, but hey, this was 1988) using Borland Turbo C, which came with a great graphical development environment for GEM. Phoenix, on the other hand, was a relational database system that shipped with a very nice IDE. I learned about relational database modeling and implemented some simple database applications.



Modula-2 was the language of choice in my first university courses, but that wasn't too much of a change from the old Pascal days. Luckily, Modula-2 compilers existed for the Atari ST as well, so I didn't have to spend my time at the always crowded university lab in front of those Apple Macs with 9-inch monitors.

In 1991 I purchased Atari's next-generation workstation, the Atari TT-030. It was equipped with a Motorola 68030 processor running at 32MHz, an 80MB hard disk, and 8MB RAM.



But Atari did not manage to make the TT a winner, while the PC was gaining more and more market share. Notwithstanding all sentimental reservations, I finally bought a 486-DX2 in 1993.