Sunday, March 23, 2008

A couple of not-so-obvious facts around REST/HTTP

While composing an entry on QCon I came across a couple of factoids around REST/HTTP that I'd thought were obvious, but which a few people at the event found surprising when I mentioned them. So rather than bury them in that post (when it eventually appears), I thought I'd bring them up here:


  • I've been developing applications on the Web since it was first released: being at University at the time, I had a lot of freedom to play. I even wrote a browser in InterViews! (Anyone else remember gopher?) Anyway, I remember being glad when the first URN proposal came out, because it looked to address some of the issues we were seeing at the time through the definition of a specific set of name servers: no longer would you have to use URLs directly; you'd use URNs and the infrastructure would look them up via the name server(s) for you. Sound familiar? Well, fast forward 10 years and that never happened. Or did it? If you consider what a naming service (or trading service) does for you, WTF are Google and Yahoo if not exactly that?

  • My friend and fellow InfoQ editor Stefan has another nice article on REST. In it he addresses some of the common misconceptions around REST, and specifically the perceived lack of pub/sub. You what? As he and I have both said separately, it seems pretty obvious that RSS and Atom are the right approach in RESTland (there's a small sketch of what that looks like from the consumer side after this list). The feedback I got at QCon the other week put this approach high on my pet-projects list for this vacation, so I've been working on it for our ESB as well as some other stealth projects of my own.

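To make the feed-based approach concrete, here's a minimal sketch of what a consumer might look like: a client that polls a (purely hypothetical) Atom feed URL and uses a conditional GET via ETag/If-None-Match, so an unchanged feed costs next to nothing. This isn't code from our ESB work, just an illustration of the pattern.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Feed-based "pub/sub" over plain HTTP: poll an Atom feed, and use a
    // conditional GET so the server only sends a body when something changed.
    public class AtomPoller {
        public static void main(String[] args) throws Exception {
            URL feed = new URL("http://example.org/events/atom"); // hypothetical feed
            String etag = null;

            for (int i = 0; i < 3; i++) { // a real consumer would loop indefinitely
                HttpURLConnection conn = (HttpURLConnection) feed.openConnection();
                if (etag != null) {
                    conn.setRequestProperty("If-None-Match", etag);
                }
                if (conn.getResponseCode() == HttpURLConnection.HTTP_NOT_MODIFIED) {
                    System.out.println("nothing new");
                } else {
                    etag = conn.getHeaderField("ETag");
                    BufferedReader in = new BufferedReader(
                            new InputStreamReader(conn.getInputStream()));
                    String line;
                    while ((line = in.readLine()) != null) {
                        System.out.println(line); // a real client would parse the <entry> elements
                    }
                    in.close();
                }
                conn.disconnect();
                Thread.sleep(60 * 1000); // poll interval
            }
        }
    }

Nothing more exotic than GET, caching headers and a bit of XML, which is rather the point.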
Now the folks I met at QCon were all very bright. So their surprise at these "revelations" came as a bit of a surprise to me. But hey, maybe it wasn't a good statistical sample.

Monday, March 17, 2008

Beautiful Code

Just back from QCon London and taking the day off (another one of those "use 'em or lose 'em" days). I'll say more about QCon in a separate entry, but I wanted to mention something that came up there and which has been playing on my mind for a while anyway: the art of beautiful code and refactoring. I heard a number of people saying that you shouldn't touch a programming language if you can't (easily) refactor applications written in it. I've heard similar arguments before, and they usually come back to the IDEs available. I'd always taken this as more of a personal preference than any kind of Fundamental Law, and maybe that (personal preference) is how many people mean it. However, listening to some at QCon it's starting to border on the latter, which really got me thinking.

Maybe it's just me, but I've never consciously factored in the question "Can I refactor my code?" when choosing a language for a particular problem. I think that's because when I started using computers you only had batch processing (OK, when I really started we were using punch cards and paper tape, but let's gloss over that one). The turnaround between submitting a job and getting the compiled result back was typically half an hour, not including the 6 floors you had to descend (and subsequently ascend). So you tried to get your programs correct as quickly as possible, or developed very good calf muscles! Refactoring wasn't possible back then, but even if it had been I don't think most of us would have bothered, because of the batch-system implications.

I try (and sometimes fail) to get the structure of my programs right at the start, so even today I typically don't make use of refactoring in my IDE. (Hey, it's only recently that I stopped using emacs as my de facto editor, just to shut up others!) But this is where I came in: it's a personal thing. Your mileage may vary, and whatever you need to do to help you get by is fine, surely? Why should it be the subject of yet another fierce industry battle? Are we really so short of things to do that we have to create these sorts of opportunities?

Oh well, time to take the day off.

Saturday, March 08, 2008

Distributed Java Project

While doing the project migration for C++SIM/JavaSim, I came across another old project of mine: a distributed Java framework. Back when Java was still Oak, there was no such thing as Java RMI. The kind of research we did in the Arjuna Project was all distributed in nature and we already had a C++ stub generator and Rajdoot RPC mechanism. So as the W3Objects work expanded (more on that in another entry), I took to implementing distributed Java. The system was interoperable with our C++ equivalent and generated client and server stubs based on C++ header files or Java interfaces. It was used in some of our research for a few years, but fell away as Java moved on and it became more of a chore to update. Ah ... those were the days.
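For a flavour of what those generated stubs did (this isn't the original code, just a hand-rolled sketch with made-up names, host and wire format): given a Java interface, the generator produced a client-side stub that marshalled the operation name and arguments over the wire to a matching server-side dispatcher, something along these lines:

    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.net.Socket;

    // A hypothetical remote interface of the kind a stub generator would be fed.
    interface Calculator {
        int add(int a, int b);
    }

    // Hand-written equivalent of a generated client stub: marshal the operation
    // name and arguments, ship them to the server, unmarshal the reply.
    class CalculatorStub implements Calculator {
        private final String host;
        private final int port;

        CalculatorStub(String host, int port) {
            this.host = host;
            this.port = port;
        }

        public int add(int a, int b) {
            try {
                Socket s = new Socket(host, port);
                ObjectOutputStream out = new ObjectOutputStream(s.getOutputStream());
                out.writeUTF("add");   // operation name
                out.writeInt(a);       // arguments
                out.writeInt(b);
                out.flush();
                ObjectInputStream in = new ObjectInputStream(s.getInputStream());
                int result = in.readInt(); // reply from the server-side dispatcher
                s.close();
                return result;
            } catch (Exception e) {
                throw new RuntimeException("remote invocation failed", e);
            }
        }
    }

The real system did rather more than this (interoperating with the C++ side, for a start), but that's the shape of the code the stub generator saved you from writing by hand.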

C++SIM/JavaSim

Back in 1990 my friend Dan McCue and I were doing work on replica management and a way to compute the optimum number and location of replicas to achieve a desired level of availability. (Yes, availability is not necessarily proportional to the number of replicas.) We needed to do some simulation work and started out with Simula, which is a nice language but one that neither of us had much experience with at the time. Both of us were (are?) C++ die-hards, so we decided that the best way forward would be to build our own simulation toolkit in C++, and C++SIM was born.

C++SIM was very successful for us (thanks to Isi for helping with some of the statistical functions). It has been used in a number of academic and industrial settings, and it was probably one of the early open source offerings too, since it was made freely available by the University. I learnt a lot from developing it, not least of which was multi-threaded programming: this was the age before the general availability of thread-aware languages and operating systems. Sun's Lightweight Process package in SunOS had been around for a few years and POSIX Threads was still in its infancy. When you wanted to run simulations on different operating systems it was impossible to target the same thread package everywhere, so I wrote a thread abstraction layer for C++SIM, as well as a couple of threading packages of my own (ah, setjmp/longjmp were my best friends back then).

In 1996 I ported C++SIM to Java, and JavaSim was born (I've never been that good with sexy names!). Because of the massive adoption of Java, JavaSim saw more uptake than C++SIM, and it was also easier to implement and maintain. Again, over the intervening years it's had a lot of use and I'm still getting feedback from people asking for updates or just reporting how they are using it (them).

Now the problem was that their existing homes were limiting. The source code repository changed several times and I didn't have direct access to maintain it, and the web site was also outside my control once I left the University. So I finally got agreement from the University to move the projects out and change the licence to something a bit more modern. I've been working on this shift for about 9 months (though the actual work only took a couple of weeks), and JavaSim/C++SIM now have a new home at Codehaus. The move isn't quite complete (I still need to find the source for the docs), but it's a start.

JBossWorld Recap

It's been a couple of weeks since I got back from JBossWorld Orlando. Enough time to blog about, but not enough spare time to actually do it! So while waiting for the family to get ready so we can go to a three-year-old's birthday party (hmmm, screaming kids ... fun!), I decided to grab some time and give a recap.

I've been to every JBossWorld bar the first one and I have to say that this one was the best (with the exception of the JBoss party, which was not really a JBossWorld party at all - maybe a Red Hat party in disguise?). There were more people at the event and it showed in the sessions: every one I went to was packed, some with people sitting on the floor in the aisles! The quality of the sessions was really good too.

Maybe it had something to do with the fact that we missed a JBW last year and people were relieved to see it back, or maybe it was the fact that we've made a lot of improvements to the technologies and processes over the past year or so. I don't have the answer, but I do know that the whole event was buzzing. When I go to conferences or workshops I usually find time to do some work (e.g., catching up on things I haven't had time to do over the previous weeks or months). Not this time: if I wasn't presenting or listening to presentations, I was talking to users, customers or friends/colleagues.

I think one of the highlights for me was my presentation on JBoss Transactions. I've done presentations on JBossTS for so long (going back decades if you count Arjuna) that I can usually predict the audience: a select number of die-hard transaction users who already "get it" and want to talk shop. Not this time. The room was packed (with people standing and sitting on the floor), even more so than the presentation on JBossESB! So much so that I had to ask the audience if they were all in the right room! Everyone stayed until the end (always a good sign) and there were lots of good questions and in-depth discussions.

We made a lot of interesting announcements during the event and I got pulled into a few press and analyst meetings. I know that all of the JBoss/Red Hat folks were happy the event took place, but so were the people from outside the company. That definitely is the highlight for me. And of course it was good to see Marc there too. It wouldn't really be a JBossWorld without him.

Wednesday, March 05, 2008

Vista Woes

So far I've managed to avoid having to use Windows Vista, but I've heard the rumours of problems over the past 12 months. Given the hype that has surrounded Vista for the past few years, it's really disappointing to hear. Until now, though, it was all hearsay. Then we bought my son a new laptop recently and it came with Vista pre-installed. He'd been using a 5-year-old PIII running XP and now has a dual-core 2Gig machine running Vista.

My initial impressions of Vista were that it looked good and felt fresh. But within an hour of using it both he and I were frustrated by the interface (WTF were they thinking of when they developed this?) and by the speed: it's really slow! Now I know the machine itself is fast, because we're running XP and Linux on the exact same configuration, so the sluggishness is purely down to the OS. After 2 months of trying to put up with it, I have to say that everything bad I've heard about Vista seems to be borne out. I'm probably going to persevere with it for a while longer just in case MSFT get their act together, but I can see us nuking Vista and going back to XP soon if things don't improve.