
A Week with Go, Day 3

The first two days of tinkering and scouring helped me form an opinion of Go based on its syntax. To form a better-informed opinion, I would have to write more code and see how much resistance I ran into along the way. What features were missing? How did the type system hold up? I wrote a rudimentary version of Deal or No Deal, and slowly some of those seemingly meaningless sections of the language spec started to take on meaning.

I decided to store the case amounts as an integer array (nobody likes the $0.01 case anyway!) and came across my first bit of frustration and misunderstanding when it came time to shuffle the amounts.
// assumes "math/rand" and "time" are imported
cases := []int{100, 200, 300, 400, 500, 750, 1000,
    5000, 10000, 50000}
shuffle(cases)

// shuffle randomizes arr in place with a Fisher-Yates shuffle.
func shuffle(arr []int) {
        rand.Seed(time.Now().UnixNano())
        for i := len(arr) - 1; i > 0; i-- {
                j := rand.Intn(i + 1) // j in [0, i] so every permutation is equally likely
                arr[i], arr[j] = arr[j], arr[i]
        }
}
The language spec says arrays are value types and slices are reference types. shuffle was doing what I wanted, but why was Go manipulating the array values if cases was supposed to be passed by value? I posted the question on Stack Overflow, and it became clear that cases was a slice: I had written []int{} expecting the behavior of [...]int{}. Hopefully this will be the first and last time I make this mistake, but I have a feeling it won't be, even with a correct understanding of what Go is doing. The syntax is just too similar. I don't understand why nearly identical syntax was chosen for two different concepts here, yet different syntax (var x = versus x :=) was chosen for what is essentially the same concept.
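
Here's a quick illustration of the difference (the helper names are just for demonstration): a []int composite literal builds a slice whose backing array is shared with the callee, while [...]int builds an array that gets copied when passed by value.

package main

import "fmt"

func zeroFirstSlice(s []int)  { s[0] = 0 } // slice header is copied; backing array is shared
func zeroFirstArray(a [3]int) { a[0] = 0 } // the entire array is copied

func main() {
        sl := []int{1, 2, 3}    // slice literal
        ar := [...]int{1, 2, 3} // array literal, length inferred as 3

        zeroFirstSlice(sl)
        zeroFirstArray(ar)

        fmt.Println(sl) // [0 2 3] -- the caller sees the change
        fmt.Println(ar) // [1 2 3] -- the caller's copy is untouched
}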

And I was lucky I only needed to shuffle a list of integer values; Go doesn't have generics, so I couldn't easily write a general-purpose shuffle function. It feels "dirty" to write shufflei(arr []int), shufflef(arr []float64), and so on, since the functions would all be identical except for their signatures!
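
To show what I mean, the duplication would look something like this (hypothetical names; the bodies are identical and only the element type of the slice changes):

// assumes "math/rand" is imported
func shuffleInts(arr []int) {
        for i := len(arr) - 1; i > 0; i-- {
                j := rand.Intn(i + 1)
                arr[i], arr[j] = arr[j], arr[i]
        }
}

func shuffleFloats(arr []float64) {
        for i := len(arr) - 1; i > 0; i-- {
                j := rand.Intn(i + 1)
                arr[i], arr[j] = arr[j], arr[i]
        }
}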

The appeal of clean-looking code, novel looping, and an official formatting utility was waning because of issues and deficiencies more integral to the language and its implementation. Go is still nascent, but we shouldn't be revisiting these problems in a modern programming language.

Feel free to share your impressions of Go in the comments below and come back tomorrow for day 4. If you're interested, here's my Deal or No Deal code.
