That, sir, was inspired.
I wanted to flame you and tell you that you should have used "%08x" %
-1, which would obviously do the right thing, but funnily enough it doesn't,
printing a lame "-0000001" instead.
Which kinda made me agree with your point, as sprintf("%08x", -1)
definitely works in Ruby, doing exactly what it looks like it's doing,
and is also portable.
Yep, in Ruby 1.8.6, '%x'%-1 returns '..f', which is probably the best
thing you can do: 2's complement that doesn't depend on any particular
data type size.
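For concreteness, here's roughly what the two look like; a quick sketch (the masking line is the usual Python idiom for a fixed-width two's-complement view, not a built-in):

# Python: '%x' formats the signed value, so negatives get a minus sign
# rather than a two's-complement bit pattern (same in Python 2 and 3):
print('%08x' % -1)                 # -0000001
print(hex(-1))                     # -0x1
# The workaround: mask to an explicit width first.
print('%08x' % (-1 & 0xFFFFFFFF))  # ffffffff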
The thing that gets me in Python is that it worked passably. That
they broke it is annoying; that they did it for completely the wrong
reasons doesn't make it any better.
hiffy is right. An inspired rant!
And as long as we're on the subject of teaching kids to program
(though I fully agree with you that it isn't really necessary, much like
teaching 'em to convert between bases), the (in)famous Rubyist _why is
working on a cute programming environment for beginners called Hackety
Hack, which is worth looking at, I think.
Also, thanks for the C++ FQA! It's great to finally have a good site
to link to when I want to save my breath explaining the evils of C++ to
some benighted colleagues.
Regarding the "inspired" bit β um, thanks, it was primarily inspired
by overtime...
As to the C++ FQA: I hope it's good as persuasion ammunition. It
turns out I can't test it myself very much, because when someone asks
you, "what's wrong with constructors?", and you say, "read this page I
wrote about it", most of their attention gets occupied by the thought:
"hey, this guy really IS a nerd". Oh well. Maybe it works better when
I'm an anonymous third party in the argument.
Interesting rant, and given that you picked up only on minor warts, I
am tempted to say that you like Python. :-)
However, I think you are overemphasizing the "teaching language"
roots of Python a bit. It is true that ABC (Python's direct ancestor) was originally for
teaching, but this link between the two languages was cut long ago. I
have never yet seen the addition of a language feature refused (or
accepted) on the basis that it would make the language harder (or easier)
for kids to learn.
And about the "true division" thingy, I can't tell whether this was a
good thing or not. The only thing I can said it that changing the division operator was controversial and
not an easy decision to take.
I do like Python, in the sense that I think it does much less harm
than good. Basically the quality scale for programming languages is:
1. So bad that you must rewrite your legacy code in something else
ASAP (COBOL? Assembly? Maybe not. Maybe there are no languages which are
that bad.)
2. Too bad for new code, not bad enough to justify rewriting existing
code (C++)
3. Good enough to write new code in, and good enough to be chosen as the
language for new projects for availability reasons (for example, if
everybody around speaks Python, it's a good enough reason to use it and
not Ruby, even though the latter is apparently somewhat better
linguistically).
4. So wonderful/easy to learn that you want to use it for new code even
though it isn't widely spoken in your environment and has other
availability problems (Lisp? D?)
Most popular languages happen to fall between 2 and 3, and Python
gets a solid 3 in my book, and 3.5 in a 3GL environment when you want to
advocate a 4GL. So yeah, I like Python.
Personally, I haven't done enough programming in C++ to judge it
(although I do find your rants and the FQA about it very entertaining). One
thing I find funny, though, is that almost every year it's a C++ team
that wins the ICFP programming contest. For example, last year Team
Smartass (a team of Googlers) won the first prize and made the
organizers declare:
C++ is the programming language of choice for
discriminating hackers.
And, the second prize was won by another team of C++ coders (United
Coding Team), but they made the organizers declare:
Perl is a fine tool for many applications.
That is pretty ironic given that the contest's main goal is to raise
awareness of functional programming languages. The write-up of the contest is quite interesting, too.
Hopefully, this year's
contest will be more challenging than "code a VM".
I don't know why people always compare Ruby to Python, and vice
versa. Ruby's design philosophy is much closer to Perl's than Python's.
Rubyists, just like Perl hackers, aim for a syntactically powerful
language at the cost of ambiguity. Pythonistas, on the other hand, aim
for readability at the cost of some rigidness in the grammar (e.g.
Python will probably never move beyond the current extended LL(1)
parser, whereas Ruby needs a quite complex LALR(1) parser). That's where
your analogy with a language for kids is perhaps somewhat true.
I agree that Lisp is a nice and powerful language. However, I think
the community has severe problems (especially when they come to
marketing the language), which makes the language a bad choice for
larger projects. However, for small projects done with friends who know
the language well, Lisp is certainly a great choice (a fun one,
too).
Anyway, I am really intrigued by D. I have heard many clever
people advocating it as an alternative to C++, but I have never tried
it. So, I took a few hours today, installed both DMD and GDC, and read
some of the documentation posted on the D website.
So far, I like it! It is an interesting blend of simple C and
high-level scripting languages, although the libraries seem a bit
immature (yet quite complete for such a young language). I will probably try
to give it a better "test-drive" next weekend (I am thinking of implementing a
small interpreter in it).
P.S. I noticed that you're using an old version of WordPress. In my
humble opinion, that is a bad idea. About a month ago, some attacker
successfully exploited a PHP injection vulnerability in WordPress 2.3
and gained shell access to my server account. Thankfully, the attacker
didn't do much damage; he simply appended some spam junk to all
my HTML pages, which I cleaned easily with one sed
command.
"Team Smartass": damn, I really envy those people. Winning a
competition as a member of Team Smartass. Is that awesome or what?
I don't follow these competitions, because of not liking programming
competitions and because of not caring that much about FP. The reason
that mainstream non-FP languages frequently win those competitions is
IMO that the users of those languages have more practice. A C++ loop
iterating over a map is ugly, but I've typed it a gazillion times and I
can do it again really fast, because I use it all the time. A
professional prehistoric warrior with a club has a good chance of taking
out an amateur shotgun enthusiast. Miss a few times and you'll be
clubbed to death.
I don't think Ruby is that close to Perl, except for the occasional
dollar sign. I think Ruby and Python are the 2 top 4GLs if your metric
is popularity * cleanliness, and they use very similar dynamic
hashtable-based REPL models, so it's natural to compare them. I think
Perl 5 is way more complex syntactically than either of them. Regarding
parsing, I don't care whether it's LL or LR, as long as it's context
free (C++ has significantly lowered my standards for "ease of parsing").
I don't think Ruby tries to do any "DWIM" parsing the way Perl does; or
does it? Which ambiguities are you talking about?
D rules.
Finally surrendered to my growing paranoia and upgraded to WP 2.5.1,
surprisingly painlessly.
1. Why the fuck should kids program?
Because programming can be a fun, creative task that at the same time
teaches them disciplined logical reasoning skills. And some kids will
like making their own Tetris clone.
2. Why the fuck should I program in a language for kids?
Python is great because it has a gentle learning curve (making it a
great first language) while at the same time scaling up to be a
professional language in most software domains.
I really advocate Python, as a language for kids, non-programmers,
and programmers. It has the things I like about Perl without the things
I don't like about Perl (mostly). In fact, I wrote a book (Invent Your
Own Computer Games with Python) specifically to teach kids programming
in Python (can we agree that it's the best real-world language for that
task at least?)
I'll take on the two flaws you bring up about Python. I think the
hex() function should behave the way it does. It simply has the job of
converting integers to hexadecimal. Two's complement is an idea that is
beyond a mere base conversion. It would bring up all sorts of overflow
problems and architecture-specific concerns (32-bit two's complement or
64-bit?)
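To illustrate the ambiguity (a sketch; the masks are the standard idiom for picking a width, not anything built into hex()):

# hex() does a plain base conversion on the signed value:
print(hex(-1))                       # -0x1
# A two's-complement rendering has to pick a width first:
print(hex(-1 & 0xFFFFFFFF))          # 0xffffffff (32-bit; Python 2 adds an 'L')
print(hex(-1 & 0xFFFFFFFFFFFFFFFF))  # 0xffffffffffffffff (64-bit)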
I understand your gripe, but it seems that you are upset that a
design trade-off went against you. I don't think that's a
fatal flaw in a language.
The division thing has always been a clusterfuck. I forget which
video, but I recall Guido van Rossum saying he regretted the whole
integer division thing, which is probably why int/int = float in
Python 3000.
But I still don't think it is too complicated. Explain it as such: if
you divide 13 by 5, you get 2 with a remainder of 3. Integer division
gives you the 2 part, mod gives you the 3 part. To get real division,
convert an operand to a float. Sure, that's dumb (hence the fix in
Python 3), but again, I don't see that as a fatal flaw in the
language.
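In code (// and % work the same in Python 2 and 3; the float() conversion is the pre-Python-3 way to get real division):

print(13 // 5)        # 2 -- the integer-division part
print(13 % 5)         # 3 -- the remainder part
print(divmod(13, 5))  # (2, 3) -- both at once
print(float(13) / 5)  # 2.6 -- convert an operand to a float for real division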
Let's put it this way: I sincerely like Python, I think kids should
program if they wanna, and the whole thing about hex and division isn't
a big deal.
I still say that base conversion is a useless operation while 2's
complement hex is useful. And if base conversion is considered
interesting, add a base() function for it; but why break hex(), which did
the useful thing for such a long time?
Let's agree that Python does have an annoying vibe of teaching you
the right thing. Consider the interactive behavior of exit or quit. The
damn things aren't strings anymore; they are objects with overridden
__repr__ telling you that you should type Ctrl-D!! WTF? Why not
sys.exit(0) instead of fucking my brain?
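For the curious, the trick is roughly this (a sketch of the idea, not the actual site.py implementation):

class Quitter(object):
    # The REPL prints repr(obj) when you evaluate a bare name, so
    # overriding __repr__ lets the object nag you instead of exiting:
    def __repr__(self):
        return "Use exit() or Ctrl-D (i.e. EOF) to exit"
    def __call__(self):
        raise SystemExit(0)

exit = Quitter()
# >>> exit     prints the nag message via __repr__
# >>> exit()   actually exits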
On the other hand, it doesn't matter. For example, I like the
attitude behind Perl more than the attitude behind Python (of course I
refer to those attitudes as I perceive them). But I prefer to use Python
rather than Perl. Creators and creations aren't trivially "similar".
> chunks=len(arr)/n; rem=len(arr)%n
Ugh. Why not
chunks, rem = divmod(len(arr), n)
?
I think that still does int division in Python 3.
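It does, in the sense that divmod on ints still returns the floor quotient and the remainder in Python 3:

arr = list(range(13))
n = 5
chunks, rem = divmod(len(arr), n)
print(chunks, rem)   # 2 3 -- still integers, even in Python 3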
The thing is, we already have a language for kids. It's called BASIC. It's what I
learned on when I was 8 and got tired of when I was 12, moving on to
scarier languages derived from C. Now I'm moving on to LISP-derived
languages.
To paraphrase,
knowledge of the intricacies of C++ (and similar) should be treated like
a martial art. First try to run away. Failing that, call the
authorities. Your last resort should be to stand your ground and
code.
I think the old Dartmouth BASIC number is a really good starting
point, as is learning assembler on a simple, old architecture like 6502.
It's all about totally understanding all there is to know about an
environment, then using those tools to analyze a problem, and
experimenting with finding the solution that is verifiable, unlike your
English or social science paper.
If they like it, most kids will move on to other languages, but they
will have a basic understanding that is hard to gain from more
sophisticated languages with large libraries. Sort of like doing math
problems in your head and longhand versus with a calculator.
I think assembly is a bad way to model computation, whether you're a
beginner or not, both because of its verbosity (as in function
calls/definitions or intermediate results of arithmetic expressions) and
its fixed-size limits (as in the number and to a lesser extent size of
registers.) IMO many a beginner would take away the feeling that programs
are verbose legalese. A certain kind of beginner would love assembler as a
first language, because they're into making watches out of a thousand
tiny moving parts; but those people will find their way to assembly or
similar anyway.
i'm just starting to learn py, so i can still clearly remember that
my brain rejected -0x1. the int div scared the hell out of me, &
half the hell when i learned about //
to the writer of "Invent your own computer games with python": i
don't give a fuck about whether they're complicated or not; what i
complain about is that they made my brain create new connections between my brain
cells, instead of reinforcing existing ones. so as neuroscience goes,
using them weakens the network that lets me know that -1 is 0xf...f in
hex
'Why the fuck should I program in a language for kids?'. this shakes
my pride & stomach
Loved the rant.
Are you this
hauptmech? Interesting stuff.
I want to like Python. I had hoped to use it for all my new CGI
scripts, but it took me forever to get used to the indentation.
I was and still am used to curly braces. I still prefer plain old C to
any of the new so-called "easy" languages.
Well, if you prefer plain old C for CGI scripts (or should I say
programs), this leaves me speechless. (I've seen people who did
everything including "scripting" in C++, also people who did everything
in C+bash+grep+awk+...; everything in C I haven't seen.)
I am currently teaching a middle school child to program in Palo
Alto, CA. The standard progression of languages in our area for children
is MIT Scratch, CMU Alice, then Java or Python. Many children are
learning Java for historical reasons. For the children learning Python,
many take advantage of Pygame, which provides 2D graphics based on
SDL.
My general view is that technology and art are merging. I don't think
we are teaching children programming for them to be professional
programmers. I believe we are teaching them programming so that they are
better prepared to express their creativity in a world that is becoming
increasingly influenced by technology.
Let's say that my students grow up to be environmental scientists or
pastry chefs. I believe that an understanding of programming will help
them in both of these jobs. Waiting until age 17 is too late. By this
age, the children are too busy preparing for college and may not have
time to experiment with learning things on their own. I want the
children to be quite proficient in programming by this age so that they
can then go back to studies such as literature or art.
For now, I teach the creation of simple arcade-style games, trying to
unlock the creative potential of children with sound, music, 2D
graphics, lists, for loops, and coordinate arithmetic.
In the same way that we teach math without any expectation that the
child will become a professional math professor, we should not teach
programming to children with the intent of training professional
computer programmers.
Our goal here is to simply provide children the opportunity to make
their own choices. Without the tools of knowledge, opportunities may
close on them.
@Craig: it's definitely an inspiring approach. A bit sadly though, as
a professional programmer who occasionally does some painting and
sculpting, I was never able to interestingly "merge" programming and
art; nor was programming useful for anything other than earning a
salary, for that matter.
Personally, I explain this partly by my own interest in
meta-programming/platforms/etc. rather than end-user programming, which
obviously isn't applicable outside of a development context, and partly
by programs being a bit like movies: theoretically, you're just making a
bit string with readily available tools, but in practice, it's not at
all easy to produce a notable result on a small budget (in terms of both
time and money).
So if anyone successfully applies programming in the context of any
sort of art, I think it's rare and notable.
P.S. As to my rant: it was, well, a rant... So this is my answer to
you, but your comment wasn't "an answer to me" in the sense that your
comment was serious. This comment of mine is serious, too, but whatever
claims I made in the original rant probably aren't...
Started out inspired but it rants on. I could not agree more however.
The best language ever, and I really mean "Ever" was Pascal.
The compiler was unforgiving and did a magnificent job of protecting
the end consumers from (well you know...) IDIOTS. The final code was
elegant, efficient, fast and above all: Reliable. (Idiot programmers
need not apply)
There was absolutely nothing that Pascal could not do that "C" could,
but "C" was far more egalitarian and IDIOT tolerant, ergo its
popularity. C was also much less readable.
Then came the stupidest idea conceived since G.W.B., called
Object Oriented Programming, giving rise to C++ and Delphi Object
Pascal. Unfortunately, because OOP is necessary under GUI OSes such as
Windows, C++ became king, and Delphi still has a following amongst Pascal
lovers such as myself.
The touted benefits of OOP (code reuse, encapsulation, and
inheritance) are greatly overrated.
1) Code re-use is available in all languages... it's called
cut-paste-and-tweak.
The problem with OOP is that in order to make computer code re-use
friendly, you have to bloat its complexity. I just love searching
through a 1047-layer class hierarchy in pursuit of the proverbial
"beef".
2) Object inheritance: Ditto.
3) Encapsulation = PASCAL units and ADT's do this exceptionally well,
with far superior rules of scope, compared to C/C++
Unlike objects, which are dynamic, Pascal records and ADT
procedures/functions can be statically implemented,
i.e. faster, more efficient, and less likely to crash and burn at run
time due to buffer overflows, memory corruption, leaks, and
fragmentation.
4) Polymorphism = no problem with Pascal procedure/function
overloading.
5) Sophisticated and custom data types = no problem in Pascal with
strong type checking, unless you specifically and deliberately re-cast
them.
6) Recursion = no Problem
7) Dynamic structures, no problem, without the dangerous pointer
arithmetic often abused by C
programmers.
8) Much more platform independent.
etc....
Like I said, there is absolutely nothing that Object Pascal/Delphi
cannot do, that C++ can, but the Pascal code is much more likely to be
stable.
--
Then came JAVA, which I call C++++++ for idiots.
And finally PYTHON ??!!!
Named after the proverbially stupid and maladjusted SULTAN of
RIDICULOUS ... Monty???!
--
Python is a BASIC/LIKE Object Oriented kids' toy on steroids:
0) 100% OOP
1) Dynamically typed (only)
2) Indenting instead of {} or begin/end statements?!
That may save a few chars, but it can really bite you
in the ass later.
3) No type checking
4) Interpreted, no compiler, ergo no compile-time checking.
5) No variable or type declarations !!!
6) Inefficient, slow, Buggy
Just because the code does NOT crash as often as C++ (because of
internal memory management) does not mean that the program is actually
doing what it is supposed to do in the first place.
Python encourages the kind of mindless, careless, goal-oriented
programming typical of teenagers without methodology or formal
education. It's great for writing phone apps, I suppose.
Now, to my horror, they are teaching this crap to my kid in first
year engineering.
The ratio of IDIOTS to HUMANS is fixed, but the damage
done by IDIOTS is exponential. Since the population is increasing, we
are doomed.
I agree that the current functionality of the bin() and hex() functions
is quite strange and useless. But this is only a tiny, minor
aspect of the language, so do not judge the entire thing based only on
this. There are third-party libraries to deal with low-level data, so
bitching about that not being in the Python core is just a waste of time.
About the integer division, dude, that is another minor change: now
you must use // to emphasize floor division. No need for casting! And
I think that is great: floor division is now a real operation, more
consistent than having to do tricks like 5/2. to get the "correct" 2.5 float
answer.
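In other words (Python 3 semantics; the same behavior is available in Python 2 with "from __future__ import division"):

print(5 / 2)    # 2.5 -- true division, no casting tricks needed
print(5 // 2)   # 2   -- floor division, spelled out explicitly
print(5 % 2)    # 1   -- the remainder, unchanged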
I believe that programming is a really useful skill to have, as useful
as math or even more so. And to make more brain-functional human beings, it is
not bad at all to learn it early in life. But Python certainly isn't
just about that. It was, of course, designed to be clean and easy to
learn, but things evolved into something much bigger than that.
You people should realize that judging the whole language based on
only one or two minor aspects is like defaming a person based on his
big nose.
I like Python; it was just a rant.
That said, Python 3 is IMO a train wreck, kind of, precisely because
of wishing to "do things right" where it isn't even that clear that the
new way is right.
Martin:
>The best language ever, and I really mean "Ever" was Pascal.
For 1990, it was great, yes (Delphi, too) :-)
>C was also much less readable.
Agreed. When I first saw C I thought it was some kind of a joke or
historical artifact. Turns out people still use it, and it's not that bad
once you develop some habits in order to paper over (the many) problems
of the language.
>Unlike objects which are dynamic
Objects in C++ or Delphi are not dynamic. Objects in Smalltalk, LISP,
Python, Ruby, ... are.
The original idea of objects was (ask Alan Kay) that an object is
something you can pass a message (a string) to and it does what you told
it to. That's all.
Dynamic languages actually do it that way, and you can actually write
a procedure that gets a string as an argument containing the message
passed (this procedure "is" the object). When you write obj.foo it will
call getattr(obj, "foo") automatically. When you write obj.bar it will
call getattr(obj, "bar").
C++ (somewhat on purpose) didn't do that (they wanted raw
performance, so there it's an integer offset, not a string) and so an
object in C++ and Delphi looks like a struct with "private variables in
the interface/header file" (not very private, is it?) which is the one
thing it wasn't meant to be (it's supposed to be a procedure, not a
struct).
About Python:
>0) 100% OOP
Yeah, and the weird struct kind, mostly. At least the REPL doesn't
print the member variables when you print the object, that would have
shown total ignorance.
In defense of Python, in Python 3 you can actually write dynamic
objects (i.e. procedures), too.
>1) Dynamically typed (only)
> 2) Indenting instead of {} or begin/end statements?!
That may save a few chars, but it can really bite you
in the ass later.
In all the years I've used it, indentation hasn't bitten me in the
ass even once. This is a complaint from people who haven't tried it.
Seriously, you are indenting anyway, in any language. Might as well skip
the braces.
(alternative: skip the indentation and keep the braces, i.e. write
the entire program in one line; not good)
(redundancy is not an alternative)
> 3) No type checking
I don't know where that myth comes from. If you were in the Python
REPL for 1 minute, you'd see it does type check. Python is strongly
typed.
>>> 1 + "hello"
TypeError: unsupported operand type(s) for +: 'int' and 'str'
> 4) Interpreted, no compiler, ergo no compile-time checking.
There are compilers (Shedskin etc) and they do check at compile time
(try it, it has very annoying error messages, almost as annoying as
Pascal compilers').
> 5) No variable or type declarations !!!
Thank god. What are they for?
This is really an assembler mindset. I don't care how it stores the
values. Don't make me write int v = 5 like an idiot. What is 5? An
integer. So what is v? Yep. No need for boilerplate stating the
obvious.
That said, there are and always have been two schools of thinking and
dynamic language users are just different in that they can't abide
boilerplate for any reason.
If you want to write a formal proof, though, you need the boilerplate;
otherwise, because of Gödel's incompleteness theorem, it's impossible to
prove what your program does without running it.
(I don't know a single programmer who writes a formal proof for all
his programs, though)
yosefk:
>"Kids" get the wrong answer: 3/2 should give 1 (honest-to-God
integer division)
That would be wrong (as in: different from any class in school I ever
had in my life).
>or it should give the ratio 3/2 (honest-to-God rational/real
number division).
I agree, that would be sensible.
>Floating point is reeeaaally too complicated.
It's fine as a compromise.
>"Programmers" get the wrong answer: chunks=len(arr)/n;
rem=len(arr)%n. 'nuff said.
chunks, rem = divmod(len(arr), n)
This is not C. You are allowed to have "multiple return values"
(*cough*).
Otherwise chunks=len(arr)//n; rem=len(arr)%n if you want to be
verbose (ugh).
>"Mathematicians" get the wrong answer: I mean, if you're doing
numeric work in Python (what's the matter, can't pay for a Matlab
license?),
I don't know your background but we physicists and mathematicians use
Python all the time (it's in the damn curriculum of the university!).
Half the push comes from us to make a computer actually compute, you
know, mathematical formulae without making mistakes like 1/2 == 0 (give
me a break...).
Projects like sympy (for symbolic computation) had to have a weird
prologue initialisation which unbreaks things like this.
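For instance (sympy is a real library, but take this as a sketch of the idea rather than a tour of its exact API):

from sympy import Rational, Symbol

x = Symbol('x')
half = Rational(1, 2)   # an exact 1/2 -- no silent truncation to 0
print(half + half)      # 1
print(half * x * 2)     # x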
So it's (partly) our fault that you were angry. Hate us :-)
Though if you are programming a computer and not doing it for maths,
what exactly are you using it for?
>you probably know about integer division versus floating point
division
Know, yes. If any language ever bothers me with silent truncation, it
gets a black mark.
>>> input()/input()
1
2
0
It better be kidding me...
>How am I supposed to do integer division? int(3/2)?
Yes. If you want to destroy information, state that you want to
destroy information. This is not PHP.
How you are supposed to do integer division: 3//2
Also, Python has one of the nicest upgrade paths I've ever seen in
any software product:
For any new feature, they wait for the next major version (it's a new
major version, all bets are off) in order to make it standard.
But they introduce it in the __future__ module some years (!) before
it becomes standard.
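Concretely (a per-module import; this works in Python 2 and is a no-op in Python 3):

from __future__ import division

print(3 / 2)    # 1.5 -- true division, years before it became the default
print(3 // 2)   # 1   -- floor division stays available as //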
As for hex(), how did it know how many bits it should use for your
two's complement in the first place? Did it guess?
I didn't even know that Python uses two's complement for storing
integers (does it?).
Nitpick: typed into the Python 3 shell, 3/2*2 gives 3.0.
I think the best programming language is Forth. It only has one
syntax, reverse Polish notation, so no one is confused about how to
code for a new version. There is no need for a new version anyway. Its one
data type, the integer, is a plus.