Earlier today, selsinork wrote (a new thread here owing to my change of topic):
selsinork wrote:
Interestingly, it's not so long ago that x86 systems were far less capable in terms of memory, storage and likely CPU than today's ARM systems, yet were perfectly capable of compiling everything required natively. How quickly we forget, and how quickly we bloat the software with unnecessary junk to the point this is no longer true.
That is so very true!!!
Software technology needs occasional revolutions too, just like society, to remove layers of accreted crud. Alas, this happens so rarely in computing that we're up to our eyeballs in it and barely keeping our heads clear. It is so rare that I'd like to highlight one relatively recent instance of it, although it is admittedly a revolution with a small 'r' as it hasn't captured much mindshare in the computing industry. It's the programming language Google Go.
It's not the Go language's syntax or semantics that are revolutionary, although it is nice enough in both respects and has many very effective features, especially for concurrent programming. What is revolutionary is its implementation meme, which renounces the idea that building software is divorced from the language definition and is performed by third-party tools ("make" and its many equivalents in the command-line and IDE worlds) with limited knowledge of the language beyond its syntax. That almost universally prevalent idea has slowed system building to a crawl right across the world of computing, and it is almost single-handedly responsible for the problem you described. (I'm referring to build time only here.)
Instead, Go provides integrated build tools and a software-construction meme in which compiling a module requires looking only at its immediate dependencies, no further. In other words, if module A depends on B, and B depends on C, then compiling A does not require looking at C. (No explosion of header-file lookups.) Extrapolate this to complex systems with many layers of dependency in breadth, each dependency itself dependent on multiple layers of dependency in depth, and as you can imagine, Go application building can be literally orders of magnitude faster than in most of the languages in common use today. Many compiles are pretty much instantaneous despite having lots of dependencies.
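As a concrete sketch of that A -> B -> C chain (package and identifier names are all invented for illustration, one file per package):

```go
// --- c/c.go ---
package c

// Pi is package c's sole export.
const Pi = 3

// --- b/b.go ---
package b

import "example.com/m/c"

// Twice uses c, so b's compiled export data records everything
// a client of b needs to know about this function.
func Twice() int { return 2 * c.Pi }

// --- a/a.go ---
package a

import "example.com/m/b" // note: no import of c anywhere in a

// Answer is compiled by reading only b's export data;
// the compiler never opens c/c.go while building a.
func Answer() int { return b.Twice() }
```

Because each compiled package carries the export information of everything beneath it, the compiler reads one file per direct import instead of re-scanning the whole transitive closure the way a C compiler re-reads nested headers.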
That's "revolutionary" of the kind that removes decades of accumulated crud which has slowed down our systems to a crawl. It's worth spreading the word about it. Oh, and it works on ARM, not as efficiently as on x86 yet but it's improving all the time. I think we should be using Go very widely for as much as possible outside the kernel.
Just today on one of my BBB's with Debian installed:
debian@bbb:# apt-get install golang
...
Need to get 21.9 MB of archives.
After this operation, 81.4 MB of additional disk space will be used....
debian@bbb:$ go version
go version go1.0.2
debian@bbb:$ vi hello.go
debian@bbb:$ cat hello.go
package main
import "fmt"
func main() {
fmt.Println("Hello, World!")
}
debian@bbb:$ go build hello.go
debian@bbb:$ ./hello
Hello, World!
debian@bbb:$
More seriously, it still has a long way to go in a few areas (integration/interoperation is one), but I see very good things ahead for Go. I recommend that engineers add it to their toolbox and gradually expand their use of it over time. Expect huge reductions in crud, and big time savings, on large projects.
Morgaine.

