Joel Spolsky on languages for web programming

David V. wrote:

trusting what I know, more than just blankly trusting Joel.

And another point is that quite a few Ruby frameworks do come to
defining a domain-specific language in Ruby - cf. Og data definition,
Puppet, rake. There’s a (maybe not quite fine) line between a very
specific framework and a DSL that just gets crossed, and I don’t believe
rubyists are the innocents to throw the first stone.
That’s not quite the same - those DSLs build upon a known and well
understood foundation, because they use Ruby’s syntax to their own ends.
I’m inferring from the very little information that’s out there that
Wasabi has its own parser, and that makes it a very, very different
beast to a DSL in the sense that I’ve come across the term in Ruby.
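
For anyone who hasn’t run into the term, here’s a minimal sketch of what an
“internal DSL” in the Ruby sense looks like - my own toy example, not actual
rake, Og, or Puppet code: ordinary method calls and blocks that happen to
read declaratively, with Ruby’s own parser doing all the work.

# Hypothetical example (not real rake/Og/Puppet code): an internal DSL
# is just Ruby methods and blocks that read declaratively; Ruby's own
# parser handles the syntax.
class TaskRunner
  def initialize
    @tasks = {}
  end

  def task(name, &block)   # `task :name do ... end` is a plain method call
    @tasks[name] = block
  end

  def run(name)
    @tasks[name].call
  end
end

runner = TaskRunner.new
runner.instance_eval do
  task :greet do
    puts "hello from a task"
  end
end
runner.run(:greet)   # prints "hello from a task"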

Alex Y. wrote:

I’m inferring from the very little information that’s out there that
Wasabi has its own parser, and that makes it a very, very different
beast to a DSL in the sense that I’ve come across the term in Ruby.

Some Wasabi info:

http://programming.reddit.com/info/g0fa/comments


James B.

“Simplicity of the language is not what matters, but
simplicity of use.”

  • Richard A. O’Keefe in squeak-dev mailing list

David V. wrote:

vice versa.
That’s the wrong argument to pick. Try calculating the full dynamics of
a modern metropolitan water supply network with just pen and paper.
Technological advances do move us from undoable to doable, and it’s
specific technologies that do it.

you seem to need 10 instead of 3 people and 5 times as long.

Pure, unadulterated shite. Give me numbers. Credible statistics and real
research, not random anecdotal success stories that are too pathetic to
sell Herbalife diet pills.
I’m not going to address this - research on this level is heavily
funded, and heavily trend-driven. The answers you get depend too
heavily on what questions you ask.

Also, initial development cost isn’t a very important factor. Recall
your uni software lifecycle charts about how much of a project’s life is
maintenance. For a successful project, the numbers are very much true.
With a successful product comes the responsibility of supporting it and
keeping it successful, and in some cases this responsibility creates
ongoing costs that dwarf the initial development horribly.
No argument there whatsoever.

Ok, sure Java’s OO may be nicer than Perl 5’s but once you brew
HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
isn’t exactly what I’d call pretty. Java is in no way a safe bet.

No one cares about pretty. It’s also a completely irrelevant issue when
deciding on implementation language if you’re at least remotely
responsible.
Actually, pretty does matter. The comfort of a problem solver directly
impacts his/her approach to a problem. That’s just human nature.

Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That’s pure fact.
I remain unconvinced by this - and it’s mainly JIT optimisation that
keeps me on the fence. Dynamic optimisations can beat static - but not
in all cases. I believe this is what one calls an “open research”
question.

Chad P. wrote:

Regardless of how good or bad a decision a given language is for a
given task, Ruby is more likely to get you fired than Java.

To be fair, it’s not just corporate politics. Statistically, it’s more
likely a development house will have a strong base of Java developers or
C# developers (C#, while being very young and so far an abomination unto
Nuggan, is reasonably Java compatible), and that starting a Rails
project means you’ll probably have to get people with no Ruby experience
on the team, or create a maintenance burden on the company in case the
original team falls apart and leaves for other companies, or whatever.

While the programming language decision might or might not have anything
to do with whether the project succeeds, choosing a locally unproven
language DOES make the project inherently higher-risk, and makes the
managers overall nervous - whence the higher likelihood of getting
fired. It’s not punishment for your failure, it’s more for all the
other mess you could’ve caused even if the project succeeded, even if
the management might not be consciously aware of that.

How good a language or its frameworks are for initially developing
something is not (maybe not by far) the most important factor when
making a decision.

David V.

Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Basically what he is saying can be deconstructed this way:

  • Do not risk developing in new cutting-edge technology, even if
    successful proofs of concept are already out there (37signals et al.).
  • Use what most people use: PHP / J2EE / .Net, not what most experts
    tell you to use. Communities and support are paramount.
  • Corporations and the people in those organizations favor safety: if
    your job is on the line, go with the tried and true. Take no risks.

All three points rely on a single assumption: FEAR.

  • Fear the technology would eventually not deliver.
  • Fear the support will not be sufficient.
  • Fear regarding your job safety as a corporate developer or manager
    who chooses Ruby or Ruby on Rails for some mission critical project.

All assumptions are wrong.

The only way significant progress is accomplished is precisely a
combination of: FAITH and COURAGE. That will make you stand out
anywhere.

The ideal place for those characteristics is inside a Startup or inside
of a bold, courageous corporation! It is not about the size of the
organization though, it is about the courage and boldness of the people
inside those companies.

People forget how the Internet, yes the OLD Internet was built. It was
done on new technology (www, http, mosaic, Perl), new development
models (open source, collaboration), new business objectives (community
first, users, and yes finally profits too.)

So, this is my take on this issue regarding Ruby and Ruby on Rails:

Do it, risk it, it’s worth it.

And the biggest advantage of Joel’s thinking, for you, is that neither
he nor the corporations who think like he does (most of them) will be
your competition. So when they do have some serious issues to tackle,
like that huge Framework called [insert-your-safe-choice-here] bending
over backwards to do what needs to be done fast… you will have the last
laugh.

Best Regards,

Jose L. Hurtado
Web D.
Toronto, Canada

David V. wrote:

How about C#? Well, it runs on Windows, and without serious and expensive
firewalls you just can’t go anywhere near the Internet.

You need to lock down Unix-based servers too. Heck, there are even
serious and expensive firewalls for Linux around too, because not
everyone has an in-house iptables guru.
But everybody should have a certified Cisco engineer if they use
Cisco routers, for example. It’s one of the costs of doing business.

Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That’s pure fact.

I’m not sure I agree with you here. First of all, while latent typing
may prevent you from optimizing (and I’m writing in Perl, not Ruby)

$j = 0;
for ($k = 0; $k < 100000; $k++) {
    $j++;
}

to

$j = $k = 100000;

that kind of optimization is a trick used by compilers to get good
performance on trivial benchmarks, rather than something with a more
wide-ranging real-world payoff.
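
For illustration, here’s a toy Ruby sketch of why an implementation can’t
fold that loop without knowing the type of the counter (the Clicker class
below is entirely made up): `+` is just a message send, and whatever object
is bound to the variable decides what it means.

# Toy illustration (made-up class): the same loop source can mean ordinary
# integer arithmetic or something else entirely, so a Ruby implementation
# can't rewrite it as `counter = 100_000` without knowing the type.
class Clicker
  attr_reader :count
  def initialize(count = 0)
    @count = count
  end
  def +(n)                  # '+' is just a method; this one builds a new Clicker
    Clicker.new(@count + n)
  end
end

def count_up(counter)
  100_000.times { counter += 1 }
  counter
end

p count_up(0)             # plain Integer arithmetic => 100000
p count_up(Clicker.new)   # same loop, user-defined '+' => a Clicker with count 100000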

Second, “compiled languages”, like Java, C#, C++ and even C have
extensive optimized run-time libraries to do all the lower-level things
that a “true optimizing compiler”, if such a thing existed, would do
automatically. Over the years, compilers have improved to the point
where they generate optimal code for things like LINPACK and the
Livermore Kernels.

In short, I don’t see why a Ruby interpreter and run time can’t
compete with a Java, C# or C++ compiler and run time! As long as you
have to have the same number of bits around to keep track of the
program’s data structures, objects, etc., “optimization” becomes a
matter of implementing the operations on the data structures
efficiently.

David V. wrote:

Noone cares about pretty. It’s also a completely irrelevant issue when
deciding on implementation language if you’re at least remotely
responsible.
Everyone cares about pretty. See Paul Graham’s essay “Taste for Makers”.

Pretty means understandable, maintainable, clean (and what the heck does
clean mean? reduced duplication?). Pretty means fewer LOC, which is
about the only objective measure of maintainability we know. (Cyclomatic
complexity being another, I suppose…) Pretty means fun, which we all
know means productive.

Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That’s pure fact.
Irrelevant. In many cases, the fact that Ruby has latent typing is an
implementation detail. Ruby has no type declarations, but in many
cases static type inference can be applied to get the same optimizations
of which Java and C# implementations avail themselves. (Disclaimer:
that’s about as much as I know about this subject.)
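
To make that concrete, here’s a small sketch of the kind of code where types
could be inferred locally without declarations - my own speculation, not a
description of anything current Ruby implementations actually do:

# Speculative illustration only: this is not an existing CRuby optimization.
# Assuming Integer#+ and Range#each are untouched, an inferring compiler
# could observe that every value here is an Integer and specialize the
# arithmetic accordingly - no type declarations needed.
def sum_upto(n)
  total = 0            # Integer literal
  (1..n).each do |i|   # i is an Integer whenever n is
    total += i         # Integer + Integer stays Integer
  end
  total
end

p sum_upto(10)   # => 55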

That’s not to say that I expect the current CRuby maintainers to add
such optimizations. They seem not to care, and that’s just fine by me.

Devin

Joseph wrote:

Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Joel is such a good writer that sometimes his jaw-dropping errors are
impossible to refute. (And don’t encourage him; he loves it when you
fight back!)

Basically what he is saying can be deconstructed this way:

  • Do not risk developing in new cutting-edge technology, even if
    successful proofs of concept are already out there (37signals et al.).
  • Use what most people use: PHP / J2EE / .Net, not what most experts
    tell you to use. Communities and support are paramount.

The open source tools that succeed must have higher technical quality
than the Daddy Warbucks tools. The latter can afford to buy their
communities and “support” networks. Because an open source initiative
cannot buy its community and marketing, only the strong survive, and
their early adopters will form this community spontaneously. They will
provide the true word-of-mouth advertising that marketing tends to
simulate.

And I am sick and tired of seeing shops dragged down by some idiotic
language choice made between the marketeers and a computer-illiterate
executive.

  • Corporations and the people in those organizations favor safety: if
    your job is on the line, go with the tried and true. Take no risks.

Ah, so looking like you are following best practices is more important
than
doing everything you can to ensure success. Gotcha!

Yes, I have seen that up close, too!

All three points rely on a single assumption: FEAR.

  • Fear the technology would eventually not deliver.
  • Fear the support will not be sufficient.
  • Fear regarding your job safety as a corporate developer or manager
    who chooses Ruby or Ruby on Rails for some mission critical project.

Yup - that’s the Fear, Uncertainty and Doubt formula that Microsoft
(among others) uses all the time. They have tried, over and over again,
to FUD Linux. Their CEO will get up on stage and say incredibly stupid
things, like “if an open source platform fails you, there is nobody you
can go to for help!” He means there’s nobody you can sue. As if you
could go to MS for help, without paying through the nose…

Oh, Joel is pro-Linux, right? What’s the difference??

All assumptions are wrong.

Better, fear that your boss will experience misguided fear.

On Mon, Sep 04, 2006 at 02:13:24AM +0900, Alex Y. wrote:

make, but it never single-handedly moves you from doable to undoable or
vice versa.
That’s the wrong argument to pick. Try calculating the full dynamics of
a modern metropolitan water supply network with just pen and paper.
Technological advances do move us from undoable to doable, and it’s
specific technologies that do it.

. . . and in any case, I don’t think anyone was saying Ruby was any kind
of guarantee of anything. The point is that Joel Spolsky’s
characterization of ultraconservative technology choices as necessarily
“right” is chaff and nonsense. Despite Joel’s usually intelligent and
well-reasoned commentary, he dropped the ball on this one, effectively
saying that Ruby is a guarantee of failure.

Bollocks, I say.

Also, initial development cost isn’t a very important factor. Recall
your uni software lifecycle charts about how much of a project’s life is
maintenance. For a successful project, the numbers are very much true.
With a successful product comes the responsibility of supporting it and
keeping it successful, and in some cases this responsibility creates
ongoing costs that dwarf the initial development horribly.
No argument there whatsoever.

I have a caveat to add:

It’s true that initial development is often one of the cheaper parts of
a “successful” project, but the cost of initial development is still critically
important. If your initial development is too costly, you never get to
maintenance. Additionally, if you think middle managers think ahead
enough to just ignore initial development costs (even when they can
afford to do so) in favor of long-term cost savings, you probably
haven’t dealt with middle managers as much as I have. CxO-types are
even worse, because their job success metrics are more tied to quarterly
stock prices and market shares than anything more long-term (generally
speaking).

Ok, sure Java’s OO may be nicer than Perl 5’s but once you brew
HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
isn’t exactly what I’d call pretty. Java is in no way a safe bet.

No one cares about pretty. It’s also a completely irrelevant issue when
deciding on implementation language if you’re at least remotely
responsible.
Actually, pretty does matter. The comfort of a problem solver directly
impacts his/her approach to a problem. That’s just human nature.

. . . and how much more do you think it costs in the long run to
maintain code that is a nasty, overly complex, ugly mess? Pretty
matters.

Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That’s pure fact.
I remain unconvinced by this - and it’s mainly JIT optimisation that
keeps me on the fence. Dynamic optimisations can beat static - but not
in all cases. I believe this is what one calls an “open research” question.

Unfortunately, JIT implementations haven’t been subjected to the same
long-term scrutiny and advancement as more traditional persistent binary
executable compiling implementations. As a result, I don’t think the
state of the art is there yet – leaving JIT implementations effectively
slower by nature until they get some more advancement over the years to
come. I really believe that gap will be closed rapidly in the near
future. Only time and experience will tell whether it can be made as
fast or faster, though I have no doubt that it can at least be made
close enough that most of us won’t care.

On Mon, Sep 04, 2006 at 12:18:30AM +0900, David V. wrote:

And another point is that quite a few Ruby frameworks do come to
defining a domain-specific language in Ruby - cf. Og data definition,
Puppet, rake. There’s a (maybe not quite fine) line between a very
specific framework and a DSL that just gets crossed, and I don’t believe
rubyists are the innocents to throw the first stone.

There’s a distinct difference between a subset of an already extant
language and an entirely separate language with its own idiomatic
syntax.

On Mon, Sep 04, 2006 at 12:29:26AM +0900, David V. wrote:

Nuggan, is reasonably Java compatible), and that starting a Rails
project means you’ll probably have to get people with no Ruby experience
on the team, or create a maintenance burden on the company in case the
original team falls apart and leaves for other companies, or whatever.

Choosing a language despite the resources at your disposal, rather than
because of them, would probably make that a “bad decision”. That in no
way invalidates the summarized point I already made:

“Regardless of how good or bad a decision a given language is for a
given task, Ruby is more likely to get you fired than Java.”

Alex Y. wrote:

beast to a DSL in the sense that I’ve come across the term in Ruby.
To use Martin F.'s terminology, there are external DSLs – a
language created for the domain and implemented with a parser, etc., in
some general-purpose language. And there are internal DSLs, written as
extensions/subsets inside a language like Ruby.

Rails and rake are internal DSLs, and Ruby makes internal DSL creation
much easier than many other languages. I can’t tell from this thread
whether Wasabi is external or internal.

I hardly think of an external DSL as anything special any more. They’ve
been around as long as I’ve been programming, which is – well, let’s
just say your toaster has more compute power than the machine I learned
on. :-) Almost every major decades-old Fortran code, for example, is
really implementing an external DSL.
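
To put the two terms side by side in code, here’s a toy routing example of my
own devising (it has nothing to do with Wasabi or Rails internals) contrasting
an internal DSL, parsed by Ruby itself, with an external DSL that needs its
own parser:

# Internal DSL: ordinary Ruby, parsed by Ruby's own parser.
class Routes
  def initialize
    @table = []
  end
  def get(path, target)
    @table << ["GET", path, target]
  end
  def draw(&block)
    instance_eval(&block)
    @table
  end
end

internal = Routes.new.draw do
  get "/posts", "posts#index"
  get "/users", "users#index"
end

# External DSL: its own syntax, which means writing and maintaining a parser.
# A toy line-oriented format, purely for illustration.
external_source = "GET /posts posts#index\nGET /users users#index\n"
external = external_source.each_line.map { |line| line.split }

p internal   # => [["GET", "/posts", "posts#index"], ["GET", "/users", "users#index"]]
p external   # same data, but we had to parse it ourselves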

On Mon, Sep 04, 2006 at 11:26:18AM +0900, M. Edward (Ed) Borasky wrote:

David V. wrote:

How about C#? Well, it runs on Windows, and without serious and expensive
firewalls you just can’t go anywhere near the Internet.

You need to lock down Unix-based servers too. Heck, there are even
serious and expensive firewalls for Linux around too, because not
everyone has an in-house iptables guru.
But everybody should have a certified Cisco engineer if they use
Cisco routers, for example. It’s one of the costs of doing business.

Frankly, iptables is easier to learn effectively than most proprietary
firewalls – and then there’s stuff like IPCop, which makes things even
easier.

Joseph wrote:

your job is on the line, go with the tried and true. Take no risks.

All three points rely on a single assumption: FEAR.

No. They rely on sound risk management principles.

  • Fear the technology would eventually not deliver.

Replace “Fear” with “Risk” and the above is reasonable if your company
does not have people experienced in a particular technology. And the
fact is that today it is still far harder to find people skilled in Ruby
than in many other languages. More importantly, there is too little
experience with many Ruby technologies for a company with no Ruby
experience to know whether Ruby will be appropriate for a specific project.

  • Fear the support will not be sufficient.

Replace “Fear” with “Risk” again. The company I work for, Edgeio, uses
PHP for our frontend code (but Ruby for our backend) because when we
started building it I had concerns about the availability of people
skilled with Ruby in general or Rails in particular.

When we started hiring, those concerns were validated: it has proved
extremely hard to find people with Ruby experience. While it’s
certainly getting easier rapidly, not everyone can afford to take the
risk. In our case I decided to start phasing Ruby in for small
self-contained components in our backend, and gradually take it from
there as we get enough Ruby skills through training or hiring. That has
proven to work well, and it meant that if we had run into unforeseen
problems, the effort involved in switching back to another
language would have been limited.

  • Fear regarding your job safety as a corporate developer or manager
    who chooses Ruby or Ruby on Rails for some mission critical project.

Which is very valid if you make a choice detrimental to the company,
regardless of which language it involves. As much as I love working with
Ruby, if someone working for me picked it for something inappropriate,
and the project tanked badly, they certainly would have to take the
responsibility for the language choice. If you don’t have Ruby skills,
or your team doesn’t have sufficient Ruby skills, or there aren’t
enough skilled Ruby developers available in your location, picking it
for a high-risk project will certainly not speak in your favor with any
risk-conscious manager.

“Fear,” as you say, or “risk,” is an important decision factor for any
conscientious manager. Deciding what level of risk is appropriate for a
project vs. the potential payoffs is one of the most important skills a
manager must have to make good decisions.

The key is whether you/your team has or can easily acquire the skills
required to minimise the risks and maximise the payoff. For many teams
that will not be the case when dealing with any specific new tech.

As for “successful proofs of concept”, they mean nothing unless a) you
have the same skills and resources as the company in question, and b)
your project is sufficiently similar. Which means most decisions about
technology tend to boil down to 1) what your team knows to a certain
degree, 2) which technologies are the most widely deployed. Ideally
you’re looking for an intersection.

Sometimes the payoff in trying a technology that your team is
inexperienced with or that isn’t widely deployed is large enough to
outweigh the risks, or the risks are mitigated by your team’s experience
(in the case of tech that isn’t widely deployed) or by the available
pool of external experience (in the case where your team doesn’t have
the skills), but that is not a decision to take lightly.

I am all for using Ruby, and I think a lot of companies that aren’t
using Ruby could get great benefit from testing it. But on low impact,
low risk, simple projects first. Not because Ruby in itself is
inherently high risk, but because few companies have enough experience
with Ruby to jump right into using it on a large or high profile
project.

Vidar

On Sep 3, 2006, at 11:55 AM, James B. wrote:

Wasabi – Joel on Software
http://programming.reddit.com/info/g0fa/comments

This was interesting reading.

I’m paraphrasing here but Spolsky’s replies in the second link
basically indicate that he trusts his team and likes to take a few
risks. I would say that’s the reason to choose Ruby on Rails as an
answer to the original article-provoking question.

James Edward G. II

Vidar H. wrote:

This leads to an interesting question: how many ruby programmers are
there, anyway?

I ran across
http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
today and boggled at the “2.5 million” number for PHP.

Any ideas for Ruby?

Bill

Chad P. wrote:

Frankly, iptables is easier to learn effectively than most proprietary
firewalls – and then there’s stuff like IPCop, which makes things even
easier.

When there are certified iptables engineers, I’ll trust my business to
them. Until then, I’m sticking with Cisco and certified Cisco engineers.
When you post a job opening for a sysadmin position, you’re going to
get at least ten times as many applicants as you need, so you can afford
to insist that they be certified by Cisco, Microsoft or Red Hat as
appropriate.

William G. wrote:

language would have been limited.

This leads to an interesting question: how many ruby programmers are
there, anyway?

I ran across
http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
today and boggled at the “2.5 million” number for PHP.

Minor correction: I ran across that, and then read the following in Mark
De Visser’s profile on LinkedIn.

Zend Technologies creates PHP products, software for rapid development
and deployment of Web applications. PHP is being increasingly adopted,
with an estimated 2.5 million developers currently using it and 22
million deployed websites.

Chad P. wrote:

long-term scrutiny and advancement as more traditional persistent binary
executable compiling implementations. As a result, I don’t think the
state of the art is there yet – leaving JIT implementations effectively
slower by nature until they get some more advancement over the years to
come. I really believe that gap will be closed rapidly in the near
future. Only time and experience will tell whether it can be made as
fast or faster, though I have no doubt that it can at least be made
close enough that most of us won’t care.

In the “good old days”, an assembly language programmer could turn out
code that was from 2 to 10 times as fast as that turned out by a
compiler, and a compiler could turn out code that was from 2 to 10 times
as fast as an interpreter.

The gap has narrowed. It’s rare that an assembly language coder can beat
a compiler by more than a factor of 2 these days, and on some
architectures it’s a dead tie – there’s only one way to do something
and the compiler always finds it. Interpreters are better now too,
mostly because today’s languages have such a large component that has to
be dealt with at run time anyway that the “heavy lifting” is done by
compiled code.

I’m not sure JIT is “necessary” for efficient interpretation of Ruby
anyway. But you’re right … if the economics is there, the gap will get
closed, just like the compiler/assembler gap got closed.

Alvin R. wrote:

Are you saying all languages yield the same level of productivity? If
they aren’t equally productive then how much more productive is Java
over C++ or VB over assembler? Do you need “credible statistics and
research” to answer the question?

He may not be saying all languages yield the same level of
productivity. But I’ll say something similar: the productivity of
programmers depends more on their knowledge of the application area and
their familiarity with the development environment than it does on the
environment and language.

There are tools that can drag down an otherwise productive team, but
they tend to get discarded fairly quickly.