Property Map

On One Map seems to be the best UK property-maps mashup, and it’s sad to see the flooding data has been pulled thanks to a draconian copyright policy by the UK Government. There are other flood maps, though.

DebConf7 Videos are up, including the Open Fonts session

The DebConf7 Videos are up!

Including the Open Fonts session!


At ATypI 2005, I heard that Pyrus, the FontLab proprietors, make most of their money on their “photofont” tools, not the professional stuff. My friend Gustavo Ferreira made an awesome tool for bitmap font design, “Elementar,” but apart from that, I know nothing about them.

And then Pippin popped up on IM today with this cool photographic font and some thoughts about them.

Such as: asset management will be the same for vector or bitmap fonts; bitmap fonts’ use of colour suggests vector fonts could also be coloured; and glyph variation, ligatures and other “advanced” font features would be just as useful for photofonts.

Why would this be useful? It would “allow nice interaction with photographic backgrounds,” he said, and this reminded me of a page of tips for type on photography written by a freeware font designer.

LaunchPad will be the first decentralised Web App?

Canonical is often criticised for LaunchPad, a web based application for the development of Ubuntu, since it is typical of all web based applications - it maintains centralised control over its users, akin to the control proprietary software developers have over their users. This is the problem the Affero GPL aims to solve.

I’ve long thought the technical solution to this lies in peer-to-peer technology, and it appears Mark Shuttleworth is also thinking along those lines:

Mark Shuttleworth was interviewed by Tim O’Reilly. Mostly softballs. Shuttleworth did make an interesting comment to the effect that Launchpad is really a stopgap until we have federated development tracking systems that can coordinate activities across projects. This is something I’ve been thinking about a bit lately.

Moglen on SugarCRM using GPLv3

Eben Moglen published an article in BusinessWeek about SugarCRM using GPLv3 that pushes hard on the business-friendliness of software freedom, which is best produced by the GPLv3.

Richard Stallman on Web Based Applications

Part 2 of a Q&A with Richard Stallman explains concisely his views of web based applications, and implies why the Affero GPL is useful.

Part 1 explains what is wrong with Ubuntu and Debian.

And this is pretty obscure, but it appears Richard Stallman invented Wikipedia!

(You can play YouTube videos with free software, using Gnash)

The Future of GNU/Linux

The future of GNU/Linux seems to be integration with proprietary web based services. This was the main theme of the keynotes at GUADEC (Red Hat’s Online Desktop, PyroDesktop), and it seems to be on the agenda at Ubuntu Live too.

I can only hope the Affero GPL will work for people making web based services that respect users’ freedom.

Sinking Tuvalu

The island nation of Tuvalu is slowly being drowned by global warming. Most people there expect to emigrate. It may be possible for 10,000 refugees to find a place to go. But if climate change sends millions fleeing, a few decades from now, will anyone let them in?

“The sea level is climbing by 5.6mm a year, twice the average global rate predicted by the UN’s International Panel on Climate Change (IPCC)” and “Australia and America and England don’t take notice of [the island nation] because we’re too small, and they want to keep their factories and cars.”

Interesting that England is perceived as one of the most harmful countries, when a Dutch government study found last month that China now tops the USA as the most polluting country - I suppose this is because those countries’ media establishments are the most widely disseminated.

But it suggests that when climate change destroys millions of people’s lives in the 2030s, England will be held to blame.

Better go visit it on a jet plane quick, then, I suppose…

Example: Open Source History Gets It All Wrong

I was doing some more software freedom history research today, and I found a small article titled “Open Source History” in Open Source Security Tools: Practical Guide to Security Applications, part of Bruce Perens’ Open Source Series (you can download the whole book at no cost [PDF]).

Having read a lot of history and discussed some of the points in heated debates online, I was amazed by what I read. I could say that it is full of fabrications, but probably the author is just confused and lazy, not checking sources.

The open source software movement has its roots in the birth of the UNIX platform, which is why many people associate open source with UNIX and Linux systems, even though the concept has spread to just about every other computer operating system available. UNIX was invented by Bell Labs, which was then the research division of AT&T. AT&T subsequently licensed the software to universities. Because AT&T was regulated, it wasn’t able to go into business selling UNIX, so it gave the universities the source code to the operating system, which was not normally done with commercial software. This was an afterthought, since AT&T didn’t really think there was much commercial value to it at the time.

Source code was the normal form for distributing software in the 60s and 70s because there were so many different hardware architectures, and the whole point of UNIX was that it was written in portable C that could be compiled on hundreds of different kinds of hardware.

Universities, being the breeding grounds for creative thought, immediately set about making their own additions and modifications to the original AT&T code. Some made only minor changes. Others, such as the University of California at Berkley, made so many modifications that they created a whole new branch of code. Soon the UNIX camp was split into two: the AT&T, or System V, code base used by many mainframe and minicomputer manufacturers, and the BSD code base, which spawned many of the BSD-based open source UNIX versions we have today. Linux was originally based on MINIX, a PC-based UNIX, which has System V roots.

The Linux kernel’s initial design was based on the Minix kernel, but it wasn’t based on its code.

The early open sourcers also had a philosophical split in the ranks. A programmer named Richard Stallman founded the Free Software Foundation (FSF), which advocated that all software should be open source.

Stallman advocates software freedom, not open source.

He developed a special license to provide for this called the General Public License (GPL). It offers authors some protection of their material from commercial exploitation, but still provides for the free transfer of the source code.

This confuses “proprietary” with “commercial,” the most common misunderstanding people have about software freedom.

In fact, the GPL offers authors the best possible protection from proprietary exploitation, but still provides for many commercial opportunities, including selling software.

Berkley had developed its own open source license earlier, the BSD license, which is less restrictive than the GPL and is used by the many BSD UNIX variants in the open source world.

The BSD license was developed later than the GPL, for the Networking Release 1 in 1989, and is more restrictive than the GPL for users, since it allows developers to remove software freedom.

These two licenses allowed programmers to fearlessly develop for the new UNIX platforms without worry of legal woes or having their work being used by another for commercial gain.

This is total nonsense. Neither license protects from legal woes because of software idea patents. The whole point of the GPL and BSD licenses is that everyone has freedom to use the software for commercial gain.

This brought about the development of many of the applications that we use today on the Internet, as well as the underlying tools you don’t hear as much about, such as the C++ compiler, Gcc, and many programming and scripting languages such as Python, Awk, Sed, Expect, and so on. However, open source didn’t really get its boost until the Internet came to prominence in the early 1990s. Before then, developers had to rely on dial-up networks and Bulletin Board Systems (BBSs) to communicate and transfer files back and forth. Networks such as USENET and DALnet sprung up to facilitate these many specialized forums. However, it was difficult and expensive to use these networks, and they often didn’t cross international boundaries because of the high costs of dialing up to the BBSs.

USENET (the original name for newsgroups) and DALnet (an IRC network) are Internet services, not BBS ones.

The rise of the Internet changed all that. The combination of low-cost global communications and the ease of accessing information through Web pages caused a renaissance of innovation and development in the open source world. Now programmers could collaborate instantly and put up Web sites detailing their work that anyone in the world could easily find using search engines. Projects working on parallel paths merged their resources and combined forces. Other splinter groups spun off from larger ones, confident that they could now find support for their endeavors. It was from this fertile field that open source’s largest success to date grew. Linus Torvalds was a struggling Finnish college student who had a knack for fiddling with his PC. He wanted to run a version of UNIX on it since that is what he used at the university. He bought MINIX, which was a simplified PC version of the UNIX operating system. He was frustrated by the limitations in MINIX, particularly in the area of terminal emulation, since he needed to connect to the school to do his work. So what became the fastest growing operating system in history started out as a project to create a terminal emulation program for his PC. By the time he finished with his program and posted it to some USENET news groups, people began suggesting add-ons and improvements. At that point, the nucleus of what is today a multinational effort, thousands of people strong, was formed. Within six months he had a bare-bones operating system.

Within six months he had a bare-bones kernel, which together with the GNU operating system developed over the previous six years formed an operating system…

It didn’t do much, but with dozens of programmers contributing to the body of code, it didn’t take long for this “science project” to turn into what we know as the open source operating system called Linux.

…which is why the operating system is called GNU+Linux or GNU/Linux.

Linux is a testament to all that is good about open source. It starts with someone wanting to improve on something that already exists or create something totally new. If it is any good, momentum picks up and pretty soon you have something that would take a commercial company years and millions of dollars to create. Yet it didn’t cost a dime (unless you count the thousands of hours invested).

Most large free software projects have some kind of commercial company backing, and the significant developments of the operating system (as opposed to the applications) have all been done by free software company employees on company time since the mid 1990s.

Because of this, it can be offered free of charge.

It can be offered free of charge because, sometimes, that is a friendly thing to do. At other times, it can be distributed for a fee, because people like to exchange money for value, and freedom is valuable.

This allows it to spread even farther and attract even more developers. And the cycle continues. It is a true meritocracy, where only the good code and good programs survive. However, this is not to say that there is no commercial motive or opportunity in open source. Linus himself has made quite a bit of money by his efforts, though he would be the first to tell you that was never his intention.

That may not have been Linus’ intention when he was an undergraduate student, because undergraduate students are rarely intent on making lots of money.

But Tom Lord wrote this was the intention of GNU developers: “As I recall, a lot of us working on the GNU software back then shared an assumption: that once the system was complete, there would always be work for systems programmers qualified to work on it, probably on an hourly basis, and mostly paying a little bit better than plumbing. It was well known that Stallman himself did occaisional $100/hr gigs: we imagined that completion of the GNU system would create a large market for such gigs, mostly working for direct end-users of the software.”

Many companies have sprung up around Linux to either support it or to build hardware or software around it. RedHat and Turbo Linux are just a few of the companies that have significant revenues and market values (albeit down from their late 1990s heights). Even companies that were known as proprietary software powerhouses, such as IBM, have embraced Linux as a way to sell more of their hardware and services. This is not to say that all software should be free or open source

Since it is unethical to use proprietary software (which you cannot share or understand), all software should respect users’ freedoms, which means being free and open source.

Although some of the more radical elements in the open source world would argue otherwise. There is room for proprietary, closed source software and always will be.

Proprietary software is moving into the network, so that you have freedom with all the software on your computer, but none with software running on servers which you access through the Internet.

But open source continues to gain momentum and support. Eventually it may represent a majority of the installed base of software. It offers an alternative to the commercial vendors and forces them to continue to innovate and offer real value for what they charge. After all, if there is an open source program that does for free what your commercial program does, you have to make your support worth the money you charge.

Again, this confuses “proprietary” with “commercial,” the most common misunderstanding people have about software freedom.

Gobuntu and the push for hardware freedom

Right now I can proudly say that all distributions of Ubuntu cost nothing, and that nearly 99.5% of the software and materials we ship gives users and the community the freedoms to share, customize and improve the software however they see fit. To help remove those bits and pieces that aren’t completely free, we have worked together with the makers of gNewSense to put out a distribution called Gobuntu. We hope that some day all distributions of Ubuntu can carry 100% of the freedoms that 99% of Ubuntu already comes with. To make that happen, and to make an all free software operating system a reality, there is a lot that needs to happen, but, most of all, we need your help.

Mark Shuttleworth announced Gobuntu a while back, recently clarified that it is the new base for gNewSense, and wrote a follow-up article suggesting that wheels are starting to turn for laptops that fully support software freedom too!

LUG Radio Live! Notes

These would be liveblogging notes, except they aren’t posted live, because my laptop has no free software wireless card drivers.

This is not a transcript. I try to note the essence of what people say, and although I type at demon speeds and often capture their phrasing, none of this can be quoted or taken as what was really said. My comments are in {} and audience remarks are in [].

I take these notes to remind me what was said, and publish them as they may be useful and interesting for you too.

I’ll liveblog your conference if you can get me there, and please send over corrections, or make your own - this is licensed to you under the GNU Free Documentation License.


{Arrived about 11:00, met a dude who is doing really cheap (£30!) 3D scanning using a small laser, a USB webcam, and free software (Python and Tkinter), and although it’s at a very early stage, it should be usable by Christmas!}

12:05 Alan Cox

(I was a bit late and missed the title of this talk, please tell me it)

Alan Cox: The business world has lots of managers, marketers, and so on.

Open Source has:

Incorrect Theory of Life #1

Humans come in two species: those who can code, and those who have NO USE AT ALL. So what do we do with those people? Can they help with projects?


One thing users are great at doing is breaking software. Write some software, use it perfectly for 3 months, post it online, and within 2 days someone will find a bug.

People making PDF document viewers love it when a PDF document crashes their viewer, because REPRODUCIBLE bugs are fixable.

Users can report bugs, and will in theory prioritise them too, as people submit common bugs until they are fixed. Bug reports vary in quality; “I loaded a document, it said something and the window went away” or “I loaded this document and it said ‘error on line 194’” - the latter is much better.

Error messages should be short, something in plain English that users can understand. “Error #1234. Please email this to”

If you have messages on screen that you can’t copy into an email, people attach screenshots, or even a digital photo of a totally locked screen with graphical programs doing really weird things with the display.

Hackers don’t understand ‘beginner’s mind’ - when someone’s not been told something is hard, they won’t dismiss things that developers presuppose are too hard. But when users suggest it, they can find ways it’s easy to do.


Trusted marketing information comes from other users who you already trust. A real user who praises something is very influential, a manager who just met a sales exec is less so, and an MS Foo advert is even less. Trust comes a lot from people in related fields.

Everyone is in the marketing team - every post on every mailing list is part of the marketing. When you do something stupid that pisses people off, it will be remembered.

We previously used ‘evangelising’ instead of ‘marketing’ because marketing was meant to be bad, but we realised this was a bit daft and have junked it, although there are still titles like “Senior Evangelist for Java”…


There is lots to translate. English is not the most spoken language in the world; there are billions of Spanish and Chinese users, and translation needs language skills - being good and fluid in the language is required for good translation. Most things start in English, although Sylpheed started wholly documented in Japanese.

Translators have to understand the meaning of what they are reading. When we translated the Red Hat installer years ago, it said “If you have a sound card on your computer”. That means: file a bug! ;-) But it had been there for years. So you can improve the original text through translation.

You need common terminology, consistent throughout the program, and you don’t need to be a coder to work on this. It’s hard to understand why something doesn’t feel right, and you need usability testing to pinpoint subtle things like that.

Font Design.

English text is easy to render. Letters are roughly the same size, in similar positions. Other languages are more fun. Some are RTL, some are RTL and quote LTR English. Arabic and others have a shape for a vowel, and if it has 2 consonants on each side, it disappears, so adding a character might not extend the line length. Pango is doing this really well now.

So translation is important, and mostly translators and programmers don’t overlap.


QA is hard. People out there are so devious - they find ways to use software the wrong way and break it. So how do you do testing?

In business, you do regression testing and pay some poor guy in India to go through all the menu options and tick boxes.

We in open source ship it, and the first 150 users test it. Fedora is like this: it tests the latest stuff with users, and gets bug reports back to developers. You want it used on lots of hardware, in lots of use cases. Giving software to a large userbase does this.

You need to track bugs. Bugzilla, trac, tinderbox, etc. Bugzilla is not user friendly. Tinderbox means people can tell who did what bug and yell at them.

Debian compiles software for a ridiculous amount of machines. So Debian developers will email you saying “it won’t compile on my VIC 20” ;)

Standards Compliance Testing

This is the most boring job I know! But some people love it, and get things into strict standards compliance. That’s good. POSIX with test suites, arguing over fine details of language - if you want a job that involves arguing without doing anything useful, being on a standards committee or body is a great job ;-)


Consider all the people who look after the resources, all the IT infrastructure out there. Eg, who wrote the Linux kernel? Linus Torvalds, and blah blah. Who runs [Pause for silence] See, no one knows.

There were lots of free kernel projects, but they didn’t get distributed reliably enough in those early days of the Internet compared to the Linux kernel.

People on mailing lists used captchas, but spammers offer people access to porn in return for solving captchas, and proxy the captchas from legitimate sites, so each time someone solves it for porn, the spammer signs up to a mailing list or posts a blog comment or whatever.


PHP is a superb example of documentation done right. The Linux kernel is a good example of it done wrong. Bad documentation is worse than no documentation at all! Eg, we have documentation for the 1.3 kernel hanging around. We need people to read the documentation, note what is junk, and throw it out.

Documentation is great QA. My wife got involved in free software documentation, and documented what everything did, so she found lots of bugs. One didn’t get fixed so the documentation said “This button causes the program to crash.” It got fixed real soon after that.


I was surprised to learn that not all are evil ;-)

A lot of licensing work happens in free software, eg GPLv3 negotiations - lots of stuff you don’t see, but it is still going on. Finding who owns what. Sun freeing Java has a lot of lawyer hours ticking up; they call it ‘due diligence’. What’s that? £150 an hour ;-)

GPL defence, and also stopping people getting into trouble. “I’ve reverse engineered a Broadcom driver, can I put this in the kernel?” Now the Broadcom effort has two separate teams, one reversing and documenting, and another implementing clean-room style.

Enforcement work is very effective at enforcing the GPL, including taking companies to court who won’t settle. Why they didn’t use FreeBSD or NetBSD is beyond me, but if they use the Linux kernel they need to keep to the rules.


The press treats everything as a company, so they treat free software as a company too, with Linus Torvalds as the CEO, and this is of course total nonsense. Eg, with all that Eric Raymond stuff earlier this year, they said stuff like “The head of marketing has left” when all it really was was ESR making a fool of himself and going off in a huff.

Collaboration means you need to know what is happening to avoid duplication of effort. Mass media don’t care about the details for this. And those who do care have only 137 words a week for computer news, so it doesn’t get in.


Good ones believe the same things you do, the bad ones believe what the music industry believes, and the professional ones believe whatever they are paid to.

Politicians get funded by corporations, but elected by people, and their job is in a way to protect them from each other.

A great thing about lobbying is that all you have to do is write a letter - and these days, thanks to the web, you just fill in a form and click.

When politicians get lots of letters from epileptics who can safely watch DVDs by using computers to cut out all the flashing, but can’t because they need free software to do that, they will change things.

Most MPs don’t have a technical background, but most are bright, and they believe in something more than ‘thanks for the peerage, here’s my visa card’. They are there to serve you, so make use of them!

Look at the telephone market today, and you can see this will be more and more important. The natural tendency of large companies threatened by small newcomers in a market is to seek excessive state regulation to block the competition. They have profit buffers to deal with the bureaucracy, but small players can’t deal with it and fold. Wireless cards show this today; large companies say: think of the public good served by protecting emergency services and not letting people write their own software. Or voting machines - you need binary-only voting machines in case people find security holes! Then you see the CCC site’s video of German hackers playing chess on a “secure” voting machine!

So that’s the list.

Fake marketing is ‘This is great, it does everything! Linux is great for hardcore Windows gaming!’

Real marketing is the truth: ‘I use it for web and email fine, I dualboot for playing games.’

People will realise themselves if whatever it is doesn’t do what they need, and they can tell easily, so tell it how it really is, if you’re in the marketing business for the long term - which in free software, we really are. The short term 12 month profit cycle is where fake marketing comes from - you see a great review for a film and you go see it, and it sucks, but that’s okay because you already bought the ticket. Right.

Software you use to run your life is different.


Audience: What about standards bodies and public service broadcasters?

Alan Cox: If you’re referring to the BBC and the iPlayer, I decline to comment, but the EU commission is considering this issue! Otherwise, I’ve tried various standards bodies; responses vary from an instant ‘Yes, we’ll try our site out in Firefox tomorrow’ to no response at all. At Amazon and so on, it’s more about knowing the people to effect change. For kernel drivers, eg Taiwanese companies who don’t know our culture, that needs a 2-way dialogue and sometimes you don’t win. Eg, talking to Nvidia some time ago, we understood each other but there was no middle ground. When a large number of people complain, that helps. CNN broke their video service for free software users; some people complained, nothing happened. Then many people complained and it was fixed. So there’s no mini handbook for reprogramming organisations, because they all work differently.

Audience: Standardisation bodies?

Alan Cox: I have no real involvement in them - I find inconsistencies at a technical level, but standards bodies generally don’t get lobbied.

Audience: Hardware documentation. Is there light at the end of the tunnel?

Alan Cox: Microsoft introduced Windows Quality Assurance stuff that made it hard and expensive to certify your driver, so now people write to standards instead. They think “It would be 2% faster if we did it smart, but it will just work if we follow the USB standard, and we won’t pay for certification.” Also, laptops are now much more standardised around Intel chips, so they work much better. We get more and more contacts. If Dell says ‘this server will run Linux and we’ll make 50,000 of them, and we need the network card to work,’ then things happen. Which is great. What’s not so good is 3D graphics. That’s good from Intel; Nvidia is doing the best they can - I think it’s bad and illegal, but they do it anyway. ATI have been promising for ages that things will be sorted out and everything will be fine, and I think it’s more likely MS will ship Linux, er, they did already, but it’s unlikely. AMD have a lot to live up to. But when they ask for CPU fixes to be done in the kernel, and we say “gee, I’m really too busy writing free graphics card drivers to get around to that any time soon,” then things start to change. A few years ago, you mailed hardware companies for drivers and they said ‘what’s Linux?’; then they said ‘we won’t give away code,’ and so we asked for documentation and they said ‘are you worth our time, what’s in it for us, are you worth it?’ And that’s a lot better.

Gerv Markham: It’s easy to go from 0% to 50%, but hard to get from 50% to 100%?

Alan Cox: The big guy doesn’t care, but smaller players see cross-platform as an opportunity, so it’s easier to get from 0% to 20% of a hardware category. And for standards, everyone but the biggest player wants to be part of a standard; there are sound economics for that. So it gets you started. But I don’t care - you get one vendor in the category and you just buy their stuff from then on.

Audience: But what about when you hope your mum’s computer will work?

Alan Cox: SATA cards, from about 3 months ago, are all standards-based now. Same for USB; most things are standards-based today, except 3D. There’s some consolidation there, but if there are 3 players and you get 1, you have a third of them.

Gerv Markham: but there are no intel plug in cards!

Alan Cox: I tell them this, but they don’t see a market for new machines with cards, I guess.

Audience: documentation?

Alan Cox: I’ve worked here a long time, but I still don’t know why documentation is still bad. I worked in proprietary companies with no documentation for code at all. We found a security hole, and we couldn’t know which other products with shared code were also affected. So it’s not just free software; free software leads the way. Give us time and we’ll sort it out.

Audience: Isn’t it better to write documentation alongside the software?

Alan Cox: In the real world, in proprietary software projects outside university courses, you write software first and then document it; it is not done at the same time. The best documentation is written by end users - they have the same level of understanding and thinking as other users, so they know things a programmer will forget to tell you. So documentation is like that. But it’s often in wiki format, and wikis aren’t a long-term information organisation solution. I don’t know what Ubuntu is doing, but Fedora is thinking about ‘how will you find this next year?’. I don’t know if it’s the same in other industries.

Audience: How to find more bugs?

Alan Cox: Use validation tools, especially for C. They turn up stuff that doesn’t happen but would crash if it did. Also fuzz testing: feeding garbage data to things. There is a 1980s tool called ‘crashme’ that executes random numbers and turns up all kinds of bugs, from the CPU to applications.
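
Fuzz testing is simple enough to sketch in a few lines. This is a hypothetical Python illustration, not a real tool: `parse_record` is a toy parser with a deliberately planted missing bounds check, and `fuzz` throws random bytes at it, recording anything other than a clean rejection as a crash:

```python
import random

def parse_record(data: bytes) -> bytes:
    """Toy parser with a deliberately planted bug (hypothetical fuzz target)."""
    if len(data) < 2:
        raise ValueError("record too short")
    offset = data[0]
    # BUG: no check that offset < len(data), so indexing can raise IndexError
    return bytes([data[offset]])

def fuzz(target, runs=200, seed=1):
    """Feed random garbage to `target`; anything other than a clean
    rejection (ValueError) is recorded as a crash worth reporting."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(runs):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(64)))
        try:
            target(blob)
        except ValueError:
            pass  # the parser rejected bad input on purpose
        except Exception as exc:
            crashes.append((blob, exc))
    return crashes

print(f"{len(fuzz(parse_record))} crashing inputs found")
```

The planted bug surfaces within a couple of hundred random inputs; tools like crashme apply the same idea at much larger scale.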

Dave Crossland: What do you think about the Affero GPL?

Alan Cox: I just glanced at it, so I don’t know.

{A while back Alan wrote, “Maybe it is time the term “open source” also did the decent thing and died out”, so it was strange he promoted the term in this talk…}

14:00 Matthew Garrett

=== Laptops: Why Linux still sucks on them and what we’re doing about it ===

I mostly work on Ubuntu, and I’m more and more upstream, in the kernel, fixing for the general case, not just the Ubuntu case.

What do people want to do with laptops? Can they do it if they run GNU/Linux?

People want laptops to move their computer from place to place.

Apple sells 17” laptops because it’s about moving from place to place, not always ‘carry round with you’. But carrying around means battery life matters, and suspend/resume needs to work. With GNU/Linux today, it’s not great.

Also doing presentations and doing work at a desk with a big screen, ie, plugging in external monitors. Again, not great.

Internet access with wireless networking is a huge use of laptops! Not great either!

Battery life.

What kills battery?

Screen backlights, especially on large-screen laptops. We have no real control from software to optimise this; it’s fixed in the hardware.

The CPU uses a lot of power, and we are suboptimal here. Various changes can be made here.

The GPU, especially running a 3D desktop, will drain things. Vendors are making hardware better at that, and we need to use those improvements.

Harddisks are spinning all the time, and getting data from disk to RAM takes power. We can do better here too.

Screens. Not a lot we can do; we have 2W LED-backlit screens instead of CRTs on desktops now, and new laptops have screens with better power use. If your laptop is idle on battery, dimming or killing the backlight makes sense. Typically Linux drivers are not good at this; some vendors, like Intel, move the brightness control into the chipset, so xrandr --prop | grep BACKLIGHT shows the state of the backlight. So with modern drivers we have the ability to change brightness from X, but this sets the maximum brightness, so the hardware control keys don't work as expected. But we're seeing support in X, and we're getting much more control over things, like 256 levels of brightness instead of 4. The main use for that is making fades animate really nicely; Keith Packard spent a while getting a really good fading algorithm, which is very cool.

Processor. Reduce voltage, not speed. Halving the speed means halving the power, but since everything else is still running at full power and now takes twice as long to get things done, it's no good. Halving the voltage, though, uses a quarter of the power, and reducing voltage will reduce speed anyway. So we have SpeedStep, and Cool "and" Quiet, which is actually called something else but is an abomination against the English language. {LOL there's the classic #digi anger}
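{The voltage arithmetic here follows the standard dynamic-power rule for CMOS, roughly P = C·V²·f. A minimal sketch of the ratios he describes; the capacitance constant is arbitrary, just for illustration:}

```python
# Dynamic CMOS power: P = C * V^2 * f, with C the switched capacitance.
def power(voltage, freq, c=1.0):
    return c * voltage ** 2 * freq

base = power(1.0, 1.0)

# Halving frequency alone halves power, but the task takes twice as long,
# so the energy spent per task is unchanged: no battery win.
half_speed = power(1.0, 0.5)
assert half_speed * 2 == base

# Halving voltage cuts power to a quarter via the V^2 term, which is where
# the real savings come from (and lower voltage forces a lower attainable
# frequency anyway, as the talk notes).
half_volt = power(0.5, 1.0)
assert half_volt == base / 4
```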

Also there is a new processor feature, C-states, that lets the processor turn parts of itself off, and this improves battery life immensely. Linux has supported this for a while, but it's not used. Why not? There is latency in turning things off, and if you're doing something in that window you sleep/wake a lot, which is slow.

Powertop was recently released by Intel; it lets you know which application processes are doing unnecessary wakeups, so they can be fixed. Applications which you think are doing nothing are waking up 100 times a second! It's like they have ADD, maybe because their authors have ADD? :-)

I'll demo this. We can see the wireless card driver, ipw2200 {proprietary software…}, is the main offender. Maybe if it hasn't seen any packets in a long time, it doesn't need to check so often. So let's run a 3D application, glxgears, and we can see the graphics card now at the top of the list. Apple's trackpad drivers are bad: if you touch the pad, it sends 500 packets with no data. It's like 3 lines of code to say, 'I've seen no data from you in a while, so shut up for a bit.' Something that's not said a lot in society as a whole. So please run powertop, find out what is consuming power on your machine, and tell its maintainers. I'm using about a fifth of the power I was using a year ago, and in the few months powertop has been out, people have really gone to town making things better.
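{The kind of behaviour powertop flags is easy to sketch: a process that wakes on a short timer 'just in case' versus one that blocks until there is actually something to do. An illustrative Python sketch of the pattern, not powertop itself:}

```python
import threading
import time

def polling_wakeups(duration=0.2, interval=0.01):
    """Wake on a fixed timer to check for work: the pattern powertop flags."""
    wakeups = 0
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        time.sleep(interval)  # the process wakes up only to find nothing to do
        wakeups += 1
    return wakeups

def blocking_wakeups(duration=0.2):
    """Block on an event instead: the kernel wakes us once, on event or timeout."""
    work_ready = threading.Event()
    work_ready.wait(timeout=duration)
    return 1

# The poller racks up roughly duration/interval wakeups; the blocker, one.
assert polling_wakeups() > blocking_wakeups()
```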

Graphics cards. Same as CPUs, and although legend has it that power saving here originated in the mobile/embedded space, it was actually on the desktop: some high-end desktop cards took 300W, so it was a good idea to use less power when the card is not in use. ATI did this in drivers; Intel does it automatically in hardware. You can also spend less time sending stuff to the screen, like framebuffer compression, which is coming in Intel's 965 chipset that is out this month, and the 945 in most Core Duo laptops already out there will support it. Again, we save about 1W or 2W this way.

Harddisks. The signalling bus to the disk takes power. AHCI supports low-power modes, and you can underclock your harddisk, although there's a performance hit; if you're on a laptop you probably don't care about that when it saves you a pile of power, like 0.5W. Spinning the disk down is an option here, but it takes more power to spin it back up, so when it's worth it is unknown; there's at least one PhD in deciding the best way to do it. People have commands to do this manually. Also, it reduces the life of the disk, and that's okay in a laptop with less than 5 years' life for most users. Many harddisks support acoustic management, and being quiet means less power, because it's spinning the platters less rapidly and not moving the heads as fast. And hey, disk noise doesn't matter on an airplane :-)

Suspend/resume. This is getting a lot better, getting to the point where we use less power than Windows, and that hasn't been true in a long time, not since the APM days. The best thing is to run the latest distribution that you can, as bug fixes come in all the time; although we break things sometimes too, we fix them quickly. The ACPI spec leaves resuming the graphics chipset to the operating system, and a lot of functionality needs to be totally right, so it is hard to do. A Core Duo needs to behave exactly like a 386 in a lot of respects for most software to work right, but since the GPU and BIOS people are the only people writing software for this, in the Windows world, we get a lot of really awful changes. NVidia could be more supportive here; all they do is provide a proprietary driver that works sometimes and not others. Intel's stuff works much better because it's free software.

Multiple monitors. First case, a simple copy of your screen on the outboard display. A BIOS key, Function-F7 or something, might do it. Next case, two different desktops, at different resolutions. The X tool randr, which stood for 'resize and rotate', was extended to allow adding monitors and changing their relation on the fly. E.g., xrandr --output VGA --left-of LVDS, and people are working on GUIs for this functionality. The free ATI drivers are adding this in CVS, NVidia's new G80x series' proprietary driver supports it, and the Nouveau driver is just about drawing on my screen now. Toshiba make things really hard, and it's like they hate people, or hate freedom, or something. Not that I'd want to compare them to terrorists.

Wireless. This is getting better. Probably has to get worse first though. Softmac wifi is the winmodem of wireless. The good news is the d80211 stack, but the bad news is all current drivers need to be rewritten for that stack. It's called wireless 'ethernet' because old cards used to do all the wireless framing in hardware, so it really did look like ethernet to the kernel, which just passed some SSID and WEP data through. Rather than have processing and memory on the card, this is now moved to the main system, and all the wireless card does is transmit things. People haven't got really upset about this yet, like we did with winmodems. So the kernel network stack says to the d80211 stack 'send this packet'; that work used to be done in hardware, but now it's done in the kernel, and once we port all the drivers, everything will be happy and shiny. Right.

Modems. How many of you use modems? [A couple] How many tried and gave up? [All] Almost all modems are winmodems now. Some have GPL-violating drivers. The next version of Ubuntu will support some chipsets; only HP use those at the moment. There are patent problems, like with everything else. Why isn't it done? No one is developing them. A lot of free software is developed by students in dorm rooms with fat pipe connections, who then make getting broadband a high priority when graduating. Users who use modems - and outside Europe, the USA, Australia and Japan, modems are the norm - are not capable of fixing this themselves yet.

Summary. It's a bit shit really. But compared to 5 years ago, it's awesome. In the next 6 months, battery life on laptops will improve hugely.


Audience: What do you think of Madwifi?

Matt Garrett: They use their own 80211 stack, copied from FreeBSD. 80211 is complex, and when it's done wrong, security weakens. All the stacks have bugs, and Ubuntu carries 3 versions for wide support. You go funny caring about this, and by funny, I mean slapping people in the face. [LOL] And the most obvious problem is binary drivers. It's not a Linux wifi driver, it's a BSD driver, so people don't like it. DadWifi is a port to use the Linux 80211 stack, which is great as we get rid of 1 out of 7 stacks. OpenHAL is the other; it seeks to reimplement the binary-only parts. Cards without proprietary parts are mostly those before 1874, er, 2004. In the future there will be free Atheros drivers integrated. So to answer the question: 'Oh, God, No'.

Audience: What hardware innovation would help?

Matt Garrett: In the future, Intel pushing Turbo Memory, which is a gigabyte of flash integrated as a harddisk cache, would help a lot. Current caching queues writes until there are a lot, or until you need to read the disk anyway; with ext3 you can increase the commit interval. But the longer you keep things off-disk, the higher the risk of data corruption.

Audience: How much power is normal use?

Matt Garrett: 8W, so cutting 2W means 20-25%, which is a couple of hours. A 17” MacBook is more like 20W. The 5 hours of life they quote is with the screen off and not touching it for 5 hours.

Audience: Availability of hardware modems? Second-hand USR Couriers?

Matt Garrett: Anything that plugs into a serial port will need a COM2USB adaptor I guess.

Audience: People use Compiz, but can we turn off 3D cards totally when not using them?

Matt Garrett: Maybe if it's idle, you can turn the GPU off totally and just scan out the framebuffer, in theory. I guess when 3D stuff is better supported, people will look at that. {I think this would be perfect for reading eBooks on laptops}

15:00 The Mass Debate

Jono Bacon: Welcome. We have four luminaries of the software world. Becky Hogge from the Open Rights Group. Chris DiBona from Google. Nat Friedman, cofounder of Ximian and now CTO for Linux at Novell. And Steve Lau from Microsoft.

Audience: What does he do?

Steve Lau: I’m an evangelist, whatever that means.

Jono Bacon: He tells lies.

Audience: What do you do Jono?

Jono Bacon: I’m the Ubuntu Community Manager. Use Ubuntu, its good for you.

Audience: For Steve Lau, why does Microsoft need to reimplement standards that already exist?

Steve Lau: We don't aim to reinvent standards. The aim is interoperability. What are you referring to?

Audience: I’m thinking Open Document Format.

Steve Lau: We chose to do Office Open XML for interoperability, for you guys. ODF doesn't cover what we needed, but OOXML is published.

Audience: It’s not fully published. And you patented the schema.

Steve Lau: I don’t know if we patented it or not.

Audience: I can tell you. You did.

Gerv Markham: You need to make a statement that you won't sue anyone for implementing OOXML, if it's really about interoperability.

Audience: Should we be boycotting the BBC over the iPlayer?

Becky Hogge: I personally sympathise with the BBC, they were on the web before anyone else, but what are they doing with the iPlayer? I hope you won't boycott them. At the Open Rights Group we tried to talk to them; most BBC people are Mac users anyway, they strike me as those sorts of people [lol], but I think we need to speak to the BBC more closely. ORG wrote to them, but it was ignored.

Dave Crossland: Is it better to distribute DRM media or no media at all?

Becky Hogge: I think if you're gonna do DRM, you need to do it for all users, on all PCs and Macs, especially if you're the BBC. Don't just let Windows users see what's there, that's not fair. It's a hypothetical question, but what EMI is realising is that with the sale of CDs continuing to fall, they may as well not have gone into digital at all, for all the profit DRM distribution has done for them.

Chris DiBona: I’m pessimistic about the EMI thing. I couldn’t believe it. It is a business in decline, and they’ll take as many down with them as they can.

Audience: What else should we do about the BBC, other than boycott them?

Becky Hogge: How would we boycott them anyway?

Audience: Stop paying the TV license.

Becky Hogge: I think we need to engage with them.

Nat Friedman: I've been part of a company that did something objectionable to a large number of people, and I'm not familiar with the BBC situation, but if you want to see organisational change: people assume genius or a master plan, like organisations think several steps ahead and there is internal consensus on the plan. But things happen, not everyone agrees, and sometimes decisions get made by small numbers of key people. Inside Microsoft there are lots of opinions. Find the people in the organisation who can change things. Don't polarise the discussion. When the Novell-Microsoft thing happened, predictions were made about what Novell would do in the future, and one was that Novell would be pushing software patents in the EU. That didn't happen; instead we funded the EFF and spoke against patents in the USA, and asked Microsoft to say no to patents, which is the sensible position for them to take, although not everyone in there has realised that. If you boycott, that will hurt people, and some in the BBC will say "Fuck them, they're not watching it anyway."

Becky Hogge: Only a few people care about this issue, so acting in that way will risk being written off.

Nat Friedman: If you want change, instead of focusing on the problem, focus on talking about the solution.

Audience: DRM is based on copyright. A TV license is based in law, and we pay for the media, but we can’t get it.

Chris DiBona: Did they say you will get anything? Law says you pay, and you get to watch TV? The law doesn’t say “BBC is a national resource”?

Audience: The BBC is chartered by the Queen, to deliver news in an impartial manner…

Chris DiBona: …for the UK…

Audience: …programming for the general public…

Chris DiBona: …in the UK…

Audience: So only supporting XP, excludes Windows 98 users, Mac users, GNU/Linux users.

Chris DiBona: Right. But you being right, doesn’t matter. It doesn’t mandate them supporting people digitally. Their whole web site, at all, is not considered part of the TV license benefit. The license system makes them able to make the decision for you, so they make the decisions that are best for them. What they are doing is wrong, but you’re not coming from a position of power, so let the Queen decide.

Steve Lau: Viewing TV online is a new means of access, and all the traditional access is still there. {And if you don't like Microsoft Office, go use a calculator and a typewriter…} By moving into that medium, should it be cross-platform? They are starting by giving access at all. The BBC website is the best I've seen for mobile phones; their model is to make as much available as possible. DRM is enabling the ability to share that content.

Becky Hogge: In the Venezuelan television controversy they said, "You can still watch it on cable", but that's really censorship. So don't deny it's shutting people out.

Chris DiBona: BBC Dirac is this codec, in its infancy, and it takes a long time to encode data, but if it develops, it could matter. The imprimatur of the BBC could really help adoption of a free codec. This is a hard conversation; codecs are owned in a patented way and licensed out, and it's a general disaster area.

{My unasked question: The BBC said publicly that DRM will be cross-platform, so the issue isn't that we can't see it, it is that DRM should not be done at all. Now Chris, you said DRM is bad, but Google Earth is a DRM application and Google shut down the free software Google Earth clients, didn't it?}

Audience: GPLv3 has been released. How do you feel?

Chris DiBona: I was on Committee A; I fed a lot of comments around. I was unhappy with how much they focused on the Microsoft-Novell deal. Any one bad act shouldn't set where we go with such an important license. The Affero part was always doomed and a mess. Section 7 became section 6, and that part is a bit better, but not much.

Audience: The DRM provisions?

Chris DiBona: I love Doctor Who so much, I'm flying out to the UK to watch the show! There is no thing called BitTorrent. But yeah, there was the additional permissions section. The problem is, if you're a consumer of GPL software, as you all are, and you need to worry about the actual text of the license, that's hard to deal with. There's this infinity of GPLs. Creative Commons says you flip a bit and it's consistent, and so now the GPL is inconsistent.

Nat Friedman: Google will use GPLv3? Isn't there something that makes you release the Gmail code?

Chris DiBona: No no!

Nat Friedman: I haven't read it, I don't really understand all that, but Eben and RMS were saying network app code would have to be published under GPLv3?

Chris DiBona: And they decided not to, because the border between linking and use is too blurred. It's like linking.

Nat Friedman: There's a distribution out there that ships 9 or 10 binary drivers that break the GPL. They don't ship the linked .ko file; they ship the .o files and build the .ko at runtime, so "you" do it. And I'm trying to imagine the judge… [LOL] This is so close to the line, it's creating new space next to the line! With this kind of thing, maybe you can totally ignore the GPL all round…

Gerv Markham: You can't build the whole OS at boot time though…

Audience: Gentoo!

Chris DiBona: Super gentoo! LOL

Nat Friedman: So there are various interpretations of linking, and it's all never been tested in court anyway.

Chris DiBona: Nah. When people say it's never been tested, forget it. When you talk about tests, you have to consider jurisdiction: are you talking about Harald Welte's German court? Are you talking about an NYC court? A judge said I was going to be in contempt of court for saying code was free speech. We're careful at Google about licenses, like VLC back in 2002, when we stripped all the good stuff out! They considered web distribution as public performance, and they pulled it, as patents and DRM was today's fight, and they pushed it out to Affero for another day, and maybe it will be in GPLv4. And at Google we have people who can write software, so even if they did that, it would not be a problem.

Jono Bacon: Okay, Morcombe and Wise are done, lets hear more topics.

Chris DiBona: I could talk about GPLv3 all day :-)

Audience: Are Microsoft and Novell in competition?

Nat Friedman: Yeah, they still sell a lame OS, and we ship a faster, more stable, free OS. We're trying to replace Microsoft desktops. Really. We do the only really large Linux desktop jobs, like 25,000 desktops coming from us, and all new employees at those places get Linux machines, 50% of them laptops. So yeah, we're totally competing.

Steve Lau: We compete, but collaborate where it makes sense. IBM is a big service provider, and we compete on some things and cooperate on other things with them too.

Audience: Will Google ship its own GNU/Linux distribution?

Chris DiBona: We use GNU/Linux at work; it's a good operating system. Looking at who we hire, Thompson and Ritchie and so on, it seems natural we might, but no, we're not going to. Ask me in October ;-) But no, we do things you could see as OS-like [lol], and there are Goobuntu rumours, and before that, Goohat. The Goobuntu story is that a tech guy in southern California posted on like a 30-person LUG list, mentioning he'd just installed the internal Goobuntu distro on his work machine, and the next day we got calls from the Wall Street Journal, and my boss was like "Chris, you didn't release that, did you?" lol

Audience: Does he still work at Google?

Chris DiBona: Sure! We all make mistakes; he was really embarrassed.

Audience: Are individual computers still important, given web apps?

Becky Hogge: No, they aren't important, personally. You notice them when you lose them. My laptop is a glorified thin client for the web services I use. I lost my laptop 3 weeks ago, and nothing changed; I couldn't remember my Yahoo ID for Flickr, that was about it. It's not really about the laptop. I was singing Doctor Who in my head there during the GPLv3 talk. In the future the Open Rights Group will care about web stuff, not laptop or desktop stuff, what happens on your own PC. Your own PC might not be your PC for long: the service you subscribe to will dictate, through live updates, what media player and what browser you have and what you can access. I'm not sure that's a good thing though.

Chris DiBona: I agree. Next 10 years, its about getting your data out of online services. A ton of interesting computer science is going on about large cluster machines. Desktops matter because you need somewhere to retreat to.

Steve Lau: We agree on something, in terms of approach. The PC won't go away, but reliance on the cloud will increase. There will be times when you don't have, or don't want, access to it though, so we'll enter a hybrid software/services world.

Chris DiBona: And its all about trust.

Nat Friedman: The GNU/Linux desktop is a double-edged sword: the more you're able to move to a GNU/Linux desktop, because you're not locked in to a proprietary desktop, correlates with desktops being less relevant in general. One trajectory is that Linux will really succeed when desktops just don't matter. What Microsoft is doing, and you hear this from Apple and Red Hat and the GNOME Foundation too, is to build, as much as possible, a web desktop that integrates well with web-based applications. Microsoft has Silverlight and Apple has .Mac-like services, though those are not modern any more. You can even write parts of the desktop with web technology, the DOM and stuff. Another trajectory is to have radically different experiences on the desktop that are not available on the web, like 3D stuff, stuff that needs performance. So Apple has iMovie and so on, which is fine today, but the web might catch up. Another trajectory is that the OS is just device drivers, and the web is the whole operating system; there's nothing you can't do in the cloud, really. So that's 3 destinies: the best web-desktop integration; a top desktop experience; or device drivers and a web browser only. We're at an interesting crossroads right now.

Audience: Video games?

Chris DiBona: Tried WINE lately? You should.

Nat Friedman: Cedega too.

Chris DiBona: We have worked with CodeWeavers for Picasa, and you know, Adobe CS can nearly run.

Nat Friedman: Cedega is for games. Pay $5/month and tell them what to port next.

Gerv Markham: Do you have games that run on the Google backend?

Chris DiBona: You can run fractals on 1,000 CPUs for fun. Haha, that's 'fun', 1,000 CPUs…

Nat Friedman: Do you break crypto keys?

Chris DiBona: No, that is not very interesting.

Nat Friedman: That depends on whose key it is…

Audience: How can the community better interact with corporations?

Nat Friedman: Rely on the concept of reasonable majority.

Dave Crossland: Reiser4 had nice search stuff, and Novell dropped ReiserFS two days after Hans Reiser's arrest. Will the ideas in ReiserFS ever return?

Chris DiBona: I’ll say this, in case Nat can’t, but, SUSE was dropping Reiser anyway. It had nothing to do with the murder. Were the ideas worth paying attention to? I thought it was interesting. I think the idea will persist.

Nat Friedman: B-tree directories are in other file systems. We’ll see the idea come up in other systems, I’m sure.

Chris DiBona: EXT6 has teleporting bits!

Audience: ODF is an ISO standard; will it be supported by governments? That's for Nat Friedman and Steve Lau.

Nat Friedman: We're the #2 developer after Sun. You know, Sun bought StarDivision and the CEO open-sourced it and then quit. Today there's a lot of back-room gaming going on, but I think we're seeing success.

Becky Hogge: The Foundation for a Free Information Infrastructure in Europe has a site…

Steve Lau: OOXML is going through fast track…

Audience: Why fast track?

Steve Lau: Get it out there and let people choose a standard. We're not railroading people…

Jono Bacon: You’re railroading a standards body… [LOL] Was ECMA fast tracked?

Steve Lau: Fast track still has massive detail review. You can read the spec.

Audience: I would read it, but it's 5000 fucking pages. [LOL]

Audience: I have read it, and parts of it are very vague. “Do this like Word 95”. Why do we need another XML document standard?

Steve Lau: We have extra granularity, so you have more control.

Audience: More control. Mmmm.

Nat Friedman: I haven't read it, but we have people at Novell who have, Michael Meeks and so on, who did participate in those discussions.

Chris DiBona: Microsoft has a dominant monopolist’s share of the desktop. When they do something, we all have to support it. That’s life.

Nat Friedman: There's so much functionality in Microsoft files, it's true. And a French company does a plugin for exporting ODF from Microsoft Office.

Chris DiBona: It's only for French documents though ;-)

Becky Hogge: And Microsoft Office is on 95% of computers, so we need to raise people's awareness, outside this community.

{I dropped my laptop and the harddisk popped out! But Nicholas Butler on the Ubuntu stall helped me out, working out that the HD had popped out, and it's working okay again :-) }

16:00 Chris DiBona

Chris DiBona: Here's a photo of the original Google hardware. A bunch of grad students made it, so they went around the building looking for computers that weren't plugged in, because those obviously belonged to them ;-) And reliability was pretty bad; the 'reliable' power supplies were worse than the normal ones when I worked at VA Linux when they still sold hardware, so Google decided to forget about hardware reliability and get reliability in the software layer. You know, we didn't even use ECC memory until a few years ago. So here's the first racked set of machines, and here's a more professional rack. We could push installations like this up in 3 days. At SourceForge, our floor fan caught fire!

So, what am I here to talk about? Google and Open Source.

Someone once said {I didn't catch the name}, "A license can ruin a perfectly good piece of software".

Where did Free Software come from?

The FSF, the GPL, Emacs/GCC. If you know RMS, you know why he made Free Software (it was to fix his printer) and why he regrets doing the LGPL: the idea is that he paid for something, so he should be able to get the full use of that thing. It's all about power in the hands of the end users.

Where did Open Source come from?

People wanted to give code away for other reasons. Here's Tim O'Reilly. I was at the movies across the street seeing some really cool movie instead of that meeting, so you can't blame me.

Open Source licenses are proliferating, and getting worse. Some vendors don't like FOSS and would like to kill it, and some say they are open source when they aren't, like, "We're open source, we're using XML". [lol]

Why do people do it? A BCG survey said as follows… {Missed this but it was an interesting list!}

Google uses the Linux kernel, Apache Tomcat and a lot of other Apache stuff. The most important stuff is the languages and compilers though: Python, Perl, and we use a lot of C++ and use GCC for that. We get a lot of utility from open source.

Why do we use it? Control and independence from external software companies. We can drill down to repair anything and enhance our services. We can do unusual stuff without showing our hand. No one is incentivised to hurt us. And it appeals to the root Google ethic. E.g., working with France Telecom on a push-to-talk thing, we had to wait 18 months for a critical bug fix.

A lot of people hate us and want to kill us. It seems funny on Slashdot, but it is real. Our competitors feel badly about us and are willing to push that. We have to be competitive, and not tell people what we're doing before we launch. So we're able to launch when we want to, and we have better launches. We rarely sign agreements with CPU limits, so we release with however many CPUs we want.

We control our own destiny. We all should be loath to give that up, as a society. Now, here could be the end of the talk.

Amazon, eBay and other top-tier websites use open source, but don't really give back. We do. So now I'm going to brag about what we do. About a million lines of code have been released, including the Airbag crash-reporting tool now used in Firefox.

Gerv Markham: Yes, in the latest builds. Before those, we used Talkback, which was proprietary and really, really sucky: it depended on Oracle, and it needed the same kind of machine to receive the bugs as the platform sending them, so you needed Windows and Mac OS X servers. It was really bad all over.

Chris DiBona: Wow! I didn't know it was that bad. MacFUSE, userspace filesystems for OS X, will be in the next version of OS X. We get lots of love from Mac people for releasing things as open source, and not so much love elsewhere; I don't know why that is. And we made a hosting site last year; we're hosting 40,000 projects now. It's SourceForge-like, yes, and has some new features, so sometimes you need to get used to it, but it's one of the largest SVN installations out there. We're happy to be the 2nd largest; there's no animosity, and I used to work there, it's all cool.

So Googlers patch like crazy, and we keep it efficient to do so, although we rarely require email addresses, so you often don't know it's us. Here's a list of projects we've contributed to {including ICU!}

We hired a bunch of prominent open source people: Andrew Morton, who is one of Linus' lieutenants; Guido van Rossum, who wrote Python; Greg Stein, the Apache chairman; and Jeremy Allison of Samba fame.

Audience: Are you using Linux 2.6 now?

Chris DiBona: Yes we are now.

So we do Summer of Code, where my boss said to me, "There are too many computer science students who don't code over the summer. Go fix that." And that was all he said, so we've done that for a couple of years now. We pay the projects who take on students for taking part, and students get something to start them off, something if they are working, and final money at the end, and you get a certificate and so on. Someone asked us for a "Failed" certificate, but we thought that was lame and didn't do it. {lol!} And we budget a bit for people who are deranged. You have to. Here's a graph of students' countries in 2006. In 2007, India and China are much bigger.

Gerv Markham: At Mozilla we see that Indians are getting a lot better at writing proposals.

Chris DiBona: People say we take people away from open source, so we bring people to it too, through this. And we get exposure in countries we'd never get to in any other way - there are great developers everywhere!

Dave Crossland: You said it was important for us all to be in control of our own destiny. Are Gmail users in control of their own destiny?

Chris DiBona: Yes, you can export the address book as CSV, and download mail over POP3. IMAP is coming. And that is more than anyone else is doing.

Dave Crossland: Do the users of an Affero GPL-licensed web-based application have more control over their own destiny than users of a proprietary web-based application?

Chris DiBona: I don't think so. No one will use it; it's just too hard. We at Google won't use it. Gmail is designed for a server farm of, say, 500 machines, to run in a datacenter, which you don't have at home, so it's useless anyway.

2007-07-07 LUG Radio Live! 2007 Sunday

Gareth Qually, Michael Sparks, Nat Friedman, Becky Hogge, Gerv Markham, Neuros Guy

11:00 Gareth Qually

Gareth Qually: I'm using Mac OS X. I'm not a programmer, but I love computers and pushing the limits of what's possible, and last year I wanted to see if open source was a viable alternative for what I use to earn a living. So here's an example of what I do. {It's a nice motion graphics show reel}

Gerv Markham: That’s what you show to people to persuade them to hire you to do their graphic design, and you did that using tools like…?

Gareth Qually: Yes. So can Linux and open source tools work for normal artists? Some of my compatriots in the industry are lazy; they feel their way around, that's part of the artistic mentality, but I like to throw myself into things. The tools we currently use are:

Photoshop. A capable, reliable tool that never crashes, or like once a year, with conservative development. I still use version 5, because from then to 8 or 9 there have been only small changes, mainly interface changes, and what it does, it does very well.

Illustrator. I used it to make these LUG Radio signs, and everything is Adobe on the 2D side for me. I use it to import drawings into my compositing and 3D apps. Most graphic artists are more illustrative and wholly 2D focused though.

3D Studio Max. This is our 3D tool at work. A reliable piece of software, in need of revision as it’s getting a bit old, but used a lot in games and TV and films.

AfterEffects. The motion graphics compositing application from Adobe. When I first used it, the interface drove me up the wall, but now there’s nothing like it for motion graphics, imo. It’s timeline-based, not node-based like Blender, and node-based is better for special effects.

So what are the Open Source equivalents?

Inkscape. I took to it like a duck to water. The default key commands worked well for me. The feature set isn’t as wide as Illustrator’s, but that didn’t bother me, and for little things it was totally fine.

Audience: Is this running on Linux, or are you just looking at open source apps in general?

Gareth Qually: Previous versions of Jahshaka didn’t work on Linux and I went to Windows for that, so this is on my XP workstation. So, Inkscape is a good package. I never looked at the documentation once, so for fellow artists, who certainly won’t, it will work brilliantly.

Gerv Markham: so inkscape is a reasonable replacement for what you do with Adobe Illustrator?

Gareth Qually: Yes, for me. As a full illustration tool, an artist might find it lacking. I read that the SVG format doesn’t have certain features though, and Inkscape is meant to be an SVG editor.

Gerv Markham: So Adobe Illustrator is vector based too?

Gareth Qually: Yes, and it’s used for designing logos and such. So here is the design work done in the open source workflow I tried for this talk, starting with the Creative Commons logo.

GIMP/Cinepaint. That was love/hate for me. I wanted to get my scanner working, but I didn’t manage to. It’s a parallel port scanner, and there were drivers out there, and it initialised, but didn’t scan - that’s more a fault of mine, not knowing the operating system, than anything else. My Wacom tablet worked well! With GIMP, painting was awesome, I loved it, and the filters were great and did things I’d never seen before. But then, here’s this image, I wanted to use it as a mask, and I couldn’t figure out how. Maybe that’s me not reading the manual, but I don’t want to spend a whole evening scouring wikis to find out how to move RGB to alpha channels. I know GIMPshop mimics the Photoshop interface, but I found it crashed a lot. The Windows version of GIMP wasn’t as good as the GNU/Linux one. Blender was good too, but a lot faster on GNU/Linux.

Video editors. We are really lacking here. You have Kino and PiTiVi, but they are consumer level, for making simple cuts. I started learning Cinelerra, but ran out of time. We need something halfway, akin to Final Cut Pro or Avid.

Audience: did you try Open Movie Editor?

Gareth Qually: No, I just didn’t see it, and the general artist will be doing the same thing as me and won’t see it either. {I hope they become better known then!} Jahshaka had a video editor, which seemed fine, but it was basic, more of a layering system than full-on editing.

Gerv Markham: Cinelerra is the 800-pound gorilla in the space; it’s hard to get it going though.

Gareth Qually: Now we come to Blender. I love it, a shining example of open source done right. Blender will be one of the Big 5 3D applications in a few years. Fluid dynamics is in there, and stuff is going in at a rate of knots that’s incredible. I like the interface; it’s intuitive once you know it, and it’s aimed at speed. It has great manuals, the docs are brilliant, and it crosses the line into the commercial side as you can buy their manuals in shops. That’s a big advantage and other projects will benefit from that. You can create different UI layouts for projects, which is nice. The material editor was tricky, but with documentation and video tutorials I got around all my problems easily, and I enjoyed it. It was rock solid; maybe it crashed just once. Compositing has huge potential for changing the graphics software market. Not a fork, but if the compositing was put into a separate application or made a bigger, major-mode style part, it could have a big effect. The compositing market is in turmoil as Apple said it is killing Shake, to bring out a new one at some point, and they haven’t said when that might be.

Audience: What is compositing?

Gareth Qually: It’s like Photoshop over time. You bring in bitmaps, vectors, 3D models, and film objects, and put them all together. With special effects in films, half the magic is in the compositing - it’s the key to everything working together. This is where I had problems though. Jahshaka has great potential, but its development seems to have slowed down; the forums are quiet and there have not been any new versions for a while, so it may have stalled. Blender’s compositing is node-based, so it’s great for effects but not motion graphics. So I was having a good time, but got stuck at this point. We’ve come far, but not far enough for complete motion graphics production on a GNU/Linux system with open source software.

So here are some demos. I did this background with GIMP, I did this logo with Inkscape, and I scanned a page from a magazine using Windows drivers and put it through GIMP filters - 12 different positions that are put together like Marvel film introduction sequences. And this Creative Commons cassette 3D model here is in Blender, and I put them all into Adobe AfterEffects.

AfterEffects has compositions inside compositions, and so this is an example, a fake Creative Commons show’s title sequence. I could in theory do this in Blender, but with no real-time update it’s really hard. But the ground is set; things are there for people to use and develop. How it gets solved, I don’t know…

I have 3 points to sum up:

  1. Standards, in interfaces and formats. To get a lot of artists using these applications, we need to make things familiar. GIMP has a version that looks like Photoshop, so people coming from Photoshop can feel at home. The same key bindings, the same interface - that helps adoption a lot. For formats, we need to support the proprietary ones. Today PSD files are well supported in GIMP, but 3D formats are more difficult. Blender is making good inroads there though.

  2. Advertising. We need to entice artists in! A shining example is the Orange project from Blender, a short film that pushes the technical features and shows it’s a capable, production-ready application. People are amazed by it, and it kicked off debate in the graphics circles I’m a part of.

Gerv Markham: Should they redo it with an understandable script? ;-)

Gareth Qually: I thought they’d make a prequel or sequel that will explain it ;-)

  3. Documentation. In general, it’s convoluted for an average person to learn things. Blender has made really good books. Another thing is movies - video tutorials. The commercial world now has DVDs loaded with tutorials that get people up and running, and that works very well. Users can make them too, and that’s a way forward.


Audience: If your company wanted a new feature, would you consider hiring a programmer?

Gareth Qually: MERL and ILM have huge programming teams that are always coding and customising the interface, and being involved in open source projects is very possible for them. But they keep their things secret, to give them an edge. For smaller companies it’s harder, and an 8-man team won’t see a 9th programmer as all that helpful. But it could work, sure.

Audience: Perhaps video editing features are not good in Blender because it’s a 3D app?

Gareth Qually: 3D is fun and sexy; video editing is less fun, with more low-level, uninteresting technical detail. But it’s vital to find people who are interested in those details!

Audience: Do artists read manuals?

Gareth Qually: I’ve never read an Illustrator manual. With design tools, I just want to get in, get something done, and get out of it.

Audience: Artists often know specific technique recipes, and use them precisely, so even if something else can do those things too, they don’t want to learn the new recipe.

Gareth Qually: Yes, some artists are like that…

Dave Crossland: Is the main motivation for artists to use free software that it’s free as in price?

Gareth Qually: [lol] Well, probably. But Blender does have a good community, does see features added all the time, and has actively contributing users, and that’s good.

11:30 Michael Sparks

{This was an hour session that I missed the first half of. Michael is the lead developer of Kamaelia, the BBC’s Python concurrency framework. He’s using openSUSE. Michael walks through some code that he says is in the project SVN, in his username’s projects directory, listed as lugradiolive.}

Michael Sparks: This all fits into a single file that’s not too long, and you can keep it in your head. At the moment, Kamaelia is already a highly capable platform, and 0.5.0 reflects where we are and where we want to go, not stability - it is very stable.

It shows you can make concurrency easy to work with, even in a normal language. Please, build your own version, following the mini axon tutorial on our site. If you do make your own, we’d love to include it!

What is mini axon? The core concurrency stuff is called Axon, and that has components on top. So here’s a simple Fibonacci sequence algorithm. Now let’s tie that to a generator, and we can see it going up the sequence with each call. Now we make it a class, a microprocess().main() object, so we can run things in parallel. Now we make message queues, with boxes: inboxes and outboxes.
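A minimal sketch of those steps - the names here (Microprocess, FibProducer, postman, scheduler) are illustrative assumptions, not Kamaelia’s actual API; see the mini axon tutorial on the Kamaelia site for the real thing:

```python
def fibonacci():
    # Step 1: a plain generator yielding the Fibonacci sequence forever.
    a, b = 1, 1
    while True:
        yield a
        a, b = b, a + b

class Microprocess:
    # Step 2: a class whose main() is a generator, so many of these
    # can be resumed in turn, i.e. run "in parallel" cooperatively.
    def __init__(self):
        self.inboxes = {"inbox": []}
        self.outboxes = {"outbox": []}
    def send(self, value, box="outbox"):
        self.outboxes[box].append(value)
    def recv(self, box="inbox"):
        return self.inboxes[box].pop(0)
    def dataReady(self, box="inbox"):
        return len(self.inboxes[box]) > 0
    def main(self):
        yield 1  # subclasses override this with their own generator

class FibProducer(Microprocess):
    def main(self):
        fib = fibonacci()
        while True:
            self.send(next(fib))
            yield 1  # hand control back to the scheduler

class Collector(Microprocess):
    def __init__(self):
        super().__init__()
        self.received = []
    def main(self):
        while True:
            while self.dataReady():
                self.received.append(self.recv())
            yield 1

def postman(src, dest):
    # Step 3: message queues - move messages from src's outbox
    # to dest's inbox, like a Unix pipe between the two.
    while src.outboxes["outbox"]:
        dest.inboxes["inbox"].append(src.outboxes["outbox"].pop(0))

def scheduler(procs, steps):
    # Toy round-robin scheduler: resume each microprocess once per
    # step, then deliver mail from the first process to the second.
    gens = [p.main() for p in procs]
    for _ in range(steps):
        for g in gens:
            next(g)
        postman(procs[0], procs[1])

prod, coll = FibProducer(), Collector()
scheduler([prod, coll], 6)
print(coll.received)  # [1, 1, 2, 3, 5]
```

Each component only touches its own boxes; the scheduler and postman own the control flow, which is the point of the exercise.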

This is all just a single screen of code, and if you know Python, it’s good to go. But even if you don’t, this tutorial was first written for a student who had just finished his A levels and had learned Python the previous week.

So that gives you a walk-through of the mini axon tutorial. We use this as a test for our Summer of Code projects, to see if applicants will do the work, and if they can.

Most of the code, apart from the generators, is easy to write in another language. We have C and C++ examples of generators, and other languages too.

So take it, use it! If you can’t do cool stuff, it’s a bug in the system - because the point is to make it easy to do cool stuff. Steal the ideas, take them and use them in your preferred language. Whatever! :-) We hope you’ll learn to deal with concurrency easily. People think concurrency is inefficient, but each generator is like a state machine, and the outboxes are intermediate variables. Most large-scale projects use state machines and intermediate buffers; we’ve just made them explicit.


Audience: What were the hard things to implement?

Michael Sparks: Mindset. None of this is 1st iteration - everything has been rewritten several times now. People do premature optimisation and avoid concurrency, but that limits scalability. So we avoided all optimisations in the beginning, and got the concepts sound. Now, Unix pipelines have been doing this for 30 years, but our system is objects, not files/data. occam did similar things 20 years ago, and Erlang is similar too, but everything had to be done in those languages. So this whiteboard application is done using Kamaelia, and it was easy with Kamaelia to make it networkable and efficient. You can have 100 components that are single threaded, but with Kamaelia, we can scale that up over a 100-CPU machine and have it perfectly efficient. It’s an idea whose time has come. People try to modify their language to do these things, and yet we still have COBOL code being cut today - modifying the language is not a real-world solution. I don’t know why it’s not been done before; it just seems obvious.

Audience: Perl?

Michael Sparks: My background is in network systems, with some C++, and when we do something we give it a friendly metaphor to make things easier.

Audience: I had a swap system in Java, using queues instead of inboxes/outboxes, and it’s componentised, but then you worry about the queue’s order, and write managers for the queues, and it gets messy. This seems great!

Michael Sparks: You can put that stuff in the framework, and it’s easy to build your own frameworks.

12:00 Nat Friedman

What’s this about? Not anything in particular, a grab bag of thoughts. I’d like to talk about lots of stuff, but interrupt me, and maybe I can talk about the MS-Novell deal too.

First I want to tell a story.

My story begins in London, in 1857, with 3 men: Richard Trench, Herbert Coleridge, and Frederick Furnivall. Trench, a clergyman; Coleridge, a barrister; Furnivall, an heir. Part of the London Philological Society. Word nerds. Trench has this cool quote about words being amber that traps thoughts. In 1857 they formed the ‘Unregistered Words Committee’ because dictionaries were missing lots of words, their illustrative quotations were poor, and they had incorrect etymologies. Trench wrote this paper about it and they proposed a totally comprehensive store of all English words.

My dad was from Oxford, and we went there when I was 12. We stayed in this hotel, and as we were checking in, the guy said ‘welcome back, Mr Friedman’. My dad didn’t think he’d stayed there before, and they got out this big old book, and actually he’d stayed there like 19 years ago. I was, like, this 12-year-old nerd, programming in Pascal and whatnot, and I had this realisation about what would happen when all the hotel records were computerised. Haha. So anyway, those 3 guys made The New English Dictionary. If they were nerds, it would be “NED”.

How? They divided it into A to Z sections, establishing subeditors for each section. Before they got started, Trench died, so Coleridge became the first editor, and he published ads in journals of the day for contributions. He had this pigeonhole system for submissions - it was their revision control system. In April 1861 he did the first release, after 4 months, so he got it out fast; it was just some words starting with A, and people didn’t really look at it. Then he got tuberculosis and died. So Furnivall was the only one left, and he was enthusiastic - he got over 2 tons of citation slips in, he was really into things for a bit, and then started other things. He founded universities and societies and loads of stuff. Remember, he’s an heir. He started new projects and lost interest in them; he was totally undisciplined. A lot of energy, but a lot of bugs, and hard to work with too - 7 subeditors quit in unison, and he resigned in 1870. He’d had 9 years as the editor, and he didn’t make a single edition. He was totally disorganised.

So those 3 were gone, and this guy Murray took it over. He was monomaniacal about it. He had like 7 or 8 children, and he set them all on it. [lol] He set up a bazaar model, placing ads in the cheapest newspapers, got lots of people into it, and built the Scriptorium. That was this building next to his house, with 1000s and 1000s of cubbyholes, and he got like 1,000 slips a day. One of the big contributors was W.C. Minor. He was a murderer, in an insane asylum during his contributions. He killed someone and lost his mind, but had that obsessive mind for making a big contribution to a dictionary project. He cut off his penis in prison, because he thought he was super sinful for masturbating. But they didn’t know all that, and were glad he contributed!

You know, open source is like this - you get these weirdos, but they contribute. Sometimes you have to budget for them, like Chris DiBona said earlier.

Anyway, in 1882 they had 3.6 million slips, and in 1884 they had 352 pages, from A to Ant. 23 years of work, and they sold 4,000 copies. 1928 was Oxford English Dictionary 1.0. It was the first open source project. In 1933 they did a reissue, in 1957 a new supplement, and in 1983 it was computerised, with 120 typists down in Florida. Wikipedia started there, btw, and it’s now $30/month on

This was an open source project, and in many ways open source is a side effect of having a network available; that’s true, and this was the first open source project, caused by the postal network. {This seems like a fallacy Gerv Markham covered, where something happens after something else, so a fake causal relationship is postulated? As I understand it, Open Source happened after the IP network was very widespread (98), but is caused by software developers having software freedom.}

If you can use a network to unleash people who can produce things but are idle, you can do great things. The Oxford English Dictionary is the first example of this.

I recently saw Luis von Ahn do this awesome Google Tech Talk on CAPTCHAs. { } CAPTCHAs are puzzles that prove that you’re human. He said 9 billion hours of solitaire were played last year, and it took 10 million hours to make the Empire State Building. So how can we capture people’s idle output? There are three ways.

  1. Pay them. Ken Kirk style, pay them ten cents to type something.

  2. Ask them. People often volunteer their time.

  3. Make it into a game. That is, architect a system where they act in their own self interest and also help you.

He invents games that are fun to play, but by playing, you build databases. One is Peekaboom, where you see an image and a word, and you click where the word is in the image; the other side sees that area and guesses the word used to describe that thing. So users create this verified database of images to words, useful for image recognition.

For a project to be successful, you also need people who dedicate their life to it, but that in itself is not enough - Furnivall was dedicated but had other, er, features. The best projects are led by a focused workaholic.

The architecture is important; even if Linus had the Windows source code, it wouldn’t allow anyone to drop in and help. The OED was like an hour of work per person that anyone could do. Firefox has the architecture of participation, both in code and socially, and “open source” companies either miss one or both a lot of the time.

So, now, the Linux desktop.

Here it is in 1992. Nice X stipple pattern! A calculator - did you know you can use .Xresources to change the colours of xcalc, which I did - you know, I had the most badass xcalc around! And there’s a Motif calendar app using LessTif, and it’s June 1992, and there’s this round clock, a shaped window - that’s very cool.

Then in 1997, this is GNOME. It’s like a brag about ‘all this shit I totally compiled, I can’t believe it myself, it’s awesome’ lol. So there’s Gnumeric - Linux users didn’t need spreadsheets, but it seemed popular so we had one anyway. Here’s a calendar that says ‘visit mother’ - awesome. And here’s a colour picker that’s very user friendly, and a GUI hex editor, showing the audience for this… Here’s another desktop from about the same time, with the Swiss cheese theme.

Usability. It was built by geeks for geeks, and there are a lot of geeks, but there are other people too, who need usability. So here’s something about the human mind: our brains are slow compared to computers. There can be up to 100 transactions per second in a neuron, but there are a lot of neurons, so the mind is massively parallel. The visual cortex takes data from the retina, does pre-attentive cognition, and sends the data to the rest of the brain. The retina itself actually does its own processing - I’ve seen videos about the eyes preprocessing.

So, let’s do a little quiz to test your brains. It’s going to be good - I show you a slide and ask a question, and you shout the answer out fast. 2 + 3? 3 x 7? Good. Okay, how many dots? How many red dots? Fuck! [There are no red dots lol]

Okay, let me change it… okay. I can’t believe that. Let’s go again! So, how many circles? [4] How many red? [6] How many circles? […] [It’s both circles and colours, so you can’t use your pre-attentive cognition to filter it, so answering accurately is really slow, or you guess fast and are incorrect]

So SUSE ran the project that published all our videos of user testing, and we took the videos to our engineers, and they are like ‘Awww, fuck’ [lol]

If your application is smart, but no users can figure it out, it is no good.

How does Novell contribute to open source? 1.4 million lines of code to Evolution, 5.6m on Mono, 200k on Beagle, 300k on F-Spot, 170k on Banshee, 85k on Compiz.

I started Ximian, so I like to see people writing code. Also OpenOffice.org - we are Cairo-ifying it and getting better Microsoft document support.

Did you guys see this EU report on the economic effect of FLOSS? They looked at 986 companies, and #8 was Ximian, with 5000 person-months, at a cost of 30,000 euros - which is interesting because we didn’t have that many person-months ;-)

And there’s a list of smaller projects. Greg Kroah-Hartman is at Novell, working on the kernel. Did you see his open source driver project? He asked for hardware specs and got a bunch! If you publish specs, people will write drivers for no money anyway, but the way he put it, we got him to work on open source drivers full time. Binary kernel drivers are bad because you’re violating the developers’ rights - everyone understands that. There’s a technical thing too: if you load binary kernel modules, the kernel might crash and we can’t help at all. So watch that space.

Audience: Your company did a podcast interview with Greg in April, by the way

Nat Friedman: Other things we do: patents. The Open Invention Network. A lot of people don’t understand patent law around the world. There are 4 types of IP law: copyright, ownership of works; trademarks, ownership of a word; patents, ownership of ideas or ways of doing things; and 4th, trade secrets. They all have totally different rules.

With trademarks, if you don’t enforce them, they go away. So Google worries that if they don’t defend the use of “google” for “search”, it becomes a word anyone can use. This happened to Band-Aid, or Kodak, Xerox, Kleenex - in the US these words are synonyms for the general category, so people are rough about trademarks because it’s use-it-or-lose-it.

Patents aren’t like that - you can wait 15 years until everyone uses your stuff, and then sue all of them. Donors built whole buildings at my uni, MIT, by doing that. When you first file, it’s all novel and inventive, but it takes 4 years to grant, and by then it’s become obvious and common knowledge, so people are using it, and you can now sue them.

Stuart Langridge: Sorry to interrupt, is the iPlayer guy in here? They hate you in the other room. Can you go over there?

{I went over there too}

Becky Hogge

Chris Hanham: Okay, first, it’s not the BBC’s fault. They have a lot of content from rights owners, who won’t allow distribution. Windows is the only platform with a valid DRM solution. There was no discussion after that - otherwise the content wouldn’t be out.

Becky Hogge: This is a good community at doing stuff, what can we do?

Chris Hanham: People in the BBC know it’s a problem, but most of the public don’t care, and there is nothing we can do.

Audience: If the BBC is buying content from producers, the BBC has negotiating power and a say - it’s not passive, it can specify those terms in contract negotiations.

Chris Hanham: These companies have a valid interest in DVD sales. They need expiry on the content or you’d just keep it for life, they make a lot of money on DVD sales.

Becky Hogge: It’s a valid concern, but EMI are starting to rethink that, so it’s a poor time for the BBC to start, when they are moving away.

Chris Hanham: I’m employed by Siemens, not the BBC, by the way. So DRM is a temporary solution; it costs more to buy a non-DRM track with EMI, and they’ll reconsider if there are lots of complaints. There is 4od {Channel 4’s on-demand internet-TV service}, and other ones. My friends have used it and it’s not that good; same with Sky. Microsoft DRM isn’t ideal - it breaks a bit. If people get a ball rolling, telling them how bad DRM is, it will change. With Sky, people post their problems on the Sky forums, and someone else posts a torrent link to the show they missed. It’s a dam that will burst in time.

Audience: So there are BBC people who know about it, but aren’t worried about it? Who are they - who can we push in the BBC?

Chris Hanham: How many here use user-agent switchers? The BBC looks at that for operating system statistics. But yes, tell them, complain.

Gerv Markham: If there were a similarly functional DRM system for Mac and GNU/Linux - on an open platform it’s perhaps easier to crack DRM systems, but if there was an ‘open DRM’ system on Mac and GNU/Linux - would that solve the problem?

Chris Hanham: Yes, there’s no other DRM available. People with rights won’t push things. But DRM is a pants idea, it will die anyway.

Audience: People are moving away from Windows, but now there are reasons not to because of the Microsoft DRM lock-in.

Dave Crossland: Would it be better to publish nothing with DRM than publish things with DRM?

Chris Hanham: I don’t know; you can get it all on BitTorrent. It’s better now as there’s a discussion - if there was no content published, there would be no discussion.

Audience: On the Backstage mailing list, BBC managers said if there was a cross platform DRM, they would use it.

Becky Hogge: ORG is developing CTO training materials, in conjunction with the London Development Agency, to educate business leaders about how non-DRM content and open content can be good for their businesses.

LUG Radio guy: In LUG Radio’s last episode, an idea was for DRM to be in the network, not the files - so only offer the download for 7 days. That’s cross-platform. Why not do that, instead of DRM in the files?

Becky Hogge: The Open Rights Group suggested streaming in our report, was that considered by the BBC?

Chris Hanham: If it was, I didn’t hear about it. With streaming things for 7 days, you can still download the streams and redistribute the files after the 7 days. Also, streaming costs too much in bandwidth. DRM is poor, and they’ll realise eventually.

Nigel Smith: I’d like to question the idea that providers won’t give content to us. The BBC has big purchasing power. With DVD sales, it’s great for them - they get paid twice for the same thing - but it’s not good for us. If DVDs have DVD-specific features, that’s good though.

Chris Hanham: A production company can take their show to Channel 4 or any other broadcaster, if the BBC terms are too harsh.

Audience: There are online music stores like Magnatune and Jamendo, and Amarok and Rhythmbox support them - they are places to get media, DRM-free. So the way to do it is to support them, and let the DRM people go the way of the dodo.

Chris Hanham: Yes, that’s right; there’s natural selection in the end. People are wiser and wiser to DRM, and the market will have to drive change.

Gerv Markham: Regardless of the moral value of double charging - an argument which has merit - there’s biting off more than we can chew. If we ask the BBC to release old content, going back to each rights holder, renegotiating, paying more money, it seems unlikely. But we can ask the BBC not to cock it up again when negotiating new contracts. It’s not helpful to ask them to do that for past things; the business case isn’t there. So let’s focus on the battle we can win.

Becky Hogge: Thank you Chris Hanham from the BBC! Let’s hope you have a job on Monday! ;-) And support the ORG and join as a member!


{Back to Nat Friedman}

Nat Friedman: …We don’t look for patents; if we get told about infringing one, we pull out code and code around it. We might possibly try to get a free software patent license, but that’s not usually necessary.

Audience: Will the GPLv3 change things?

Nat Friedman: GPLv3 has no effect on Novell - it doesn’t nullify the agreement - but has a strong effect on Microsoft. If you convey GPLv3 code, or secure its conveyance, you’re bound by GPLv3 and you’ve licensed all your patents for all downstream users. So Microsoft would give away all the patents with the SUSE coupons. IANAL so I don’t know, and the Microsoft lawyers have to disagree loudly on principle. But eventually, they’ll have to get on the right side of the patent issue; it just makes sense to.

Audience: Open source can be seen as a side effect of networks. Looking ahead, networks are used more and more, especially with wifi, so how will the operating system landscape change?

Nat Friedman: Collaborative intellectual property is spreading everywhere - Wikipedia is doing a lot there, and Chris DiBona’s slide listed Summer of Code countries, which correlates with electricity and internet access. I had friends who programmed on paper growing up, and they became great because they didn’t care about syntax. I was reading in Chris DiBona’s book Open Source 2.0 that it’s like missing the bus. Our goal was a fully open source operating system to replace Windows, but we’re not in that world any more. If you want to run a service like Google, there is no equivalent to open source. IBM gave away the farm to Microsoft with the IBM PC, and open source is like that with web-based applications. That’s a big change.

Thank you!



15:00 Joe Born

Joe Born: The path of the $100 media center.

When you say machines, that implies PCs, but a lot of devices are PC-like now. Every device in your house is, really: phones, TVs. And it’s ubiquitous around the world when it’s a sub-$100 device - not just in the west, but everywhere. OLPC gets a lot of press, and everyone here hopes they succeed, but the reality is open electronics without barriers will become common. And it’s not changing existing human behaviour - people use cell phones, regular phones, TVs, set top boxes, handhelds - we use them all already. The whole world will.

That is changing the way we communicate as human beings. The freedom to communicate is at stake here, and unlike the early days, this battle hasn’t been lost at the start. This battle is up for grabs.

Microsoft is not a dominant player, Apple has had some victories, but we can make a difference here. It won’t stay that way forever; there will be winners and losers in each category, just like PCs.

So how do I see things evolving and where can you participate? Our company is one place, the XBox Media Center is being sold here today, and the MythTV project too, there is lots you can participate in.

What I’m demoing today is $200, but it shows the power of electronics today: a powerful device that’s very small, silent, and cheap compared to PCs. Embedded devices are made from the ground up to do what they do.

So what is an embedded device? Anything that’s not a PC, I say. It’s defined by people’s usage, not the CPU’s architecture. A lot of electronics now use the x86 architecture, but when you don’t expect it to support all the normal PC peripherals and legacy software, it’s not really a PC.

Why are we at a crossroads right now? Devices today are powerful, the hardware is there, and electronics are now like little PCs. The iPhone runs OS X, iPods will be next, the AppleTV too - similarly, GNU/Linux has inroads here, and devices run fully fledged operating systems in many respects. Nokia 770s and 800s use a great deal of open source. Sony has Yellow Dog GNU/Linux on the PS3, and now there are a few attempts to reach consumer awareness, like the OpenMoko cell phone and Neuros (us), that remind users that this product is community evolved, and that you get more value out of your hardware that way.

The benefits are obvious. Xbox Media Center’s new features are obvious, and open source seems to be synonymous with getting new features.

The time is ripe, because we see silicon to support advanced software, and software complexity is overwhelming the traditional Asian supply chain. A PVR-on-a-chip is being brought out by Texas Instruments - it’s a tiny ARM9 chip that can decode and encode video!

The design and engineering of things in Asia was okay with VCRs, which we didn’t really understand, but when it’s operating systems, frameworks, plugins… they turn to more complete solutions from 3rd parties. Microsoft Windows CE is pushed hard there, and GNU/Linux is in there too - it’s all in flux.

Devices don’t just do their one function; people are looking at what is available. We use Texas Instruments, and they include GNU/Linux thanks to MontaVista.

Where is GNU/Linux winning? Customisation: embedded devices require differentiation, and that means requiring the freedom to be different. Physical footprint: GNU/Linux can go where CE and OS X can’t, like the really small $20-$50 routers that proprietary stuff can’t reach. Performance: with access to the kernel and frameworks, you can run high-horsepower applications like high definition video.

Where is it losing? Commoditised areas: vendors take Windows CE, it comes complete, and they don’t tweak it. Many Asian manufacturers just follow the exact plans from Microsoft and they’re done. And DRM is a real menace - it’s illegal to reverse engineer it. “Does it work with iTMS?” “No.”

It’s the same with Outlook/Exchange on the enterprise side. The same old threats to free software, and we can see them clearly here: the DMCA and patents. DRM and proprietary modules are scary. The Free Software Foundation is trying to help with all those things with GPLv3.

So here is our product; it’s what I’m familiar with, so here’s a case study.

Why did we, as a vendor and a business, decide to go this way with this device?

The vision for this is $100, but this one is $200; the older one was $100. It is a set top box for recording TV. We wanted something open that records TV and lets you bring TV onto your PC.

Here’s what a PC is like [slide], here’s what electronics is like [slide], and convergence gets the best of both. The biggest thing is that electronics are not open: you buy them as-is, and that’s a huge limitation and loss of value to the device. If the hardware is powerful but you can’t use that power, you’re losing out.

Neuros OSD is a dedicated media center; it plays MPEG4, it is silent, and it’s been on sale since late 2006. There is no PCI and no keyboard and mouse - we give up all the legacy stuff. It has a USB host, so a range of peripherals can plug in to that. The CPU is a DM320 from Texas Instruments, a 200MHz ARM9 that is stronger than a PC of 7 years ago, with 64MB RAM. It boots Linux kernel 2.6.15 and a custom GNU/Linux distribution that supports our peripherals. It is open, but has a lot of proprietary stuff in there too, mostly codecs for which there are no open source equivalents on this architecture. We license the proprietary stuff, but our communities are replacing them - we embedded device people need help with that! Wireless USB cards will work, and the front has memory card slots so you can record to an SD card and pop it in a portable video player.

Who cares? Anyone who wants the best functioning, most powerful, evolving devices. We make consumers understand that freedom is not some esoteric thing - it’s a real thing you use, and you don’t have to be a programmer to get the benefits from it.

It’s also open for commercial applications: it’s a box you can take, customise and resell. An Italian company uses them to pipe ads through TVs in dentistry offices, and they chose it because of the customisation it afforded them. Free software allowed them the freedom to innovate and do business.

What differences are there developing for devices instead of PCs? Not much. Multimedia is sucky, and the GUI is nano-X with some custom stuff. The GPL application code is hosted in SVN, and we have an active community that collaborates with the company.

All the open source contributions go into the official releases, so users don’t have to choose between a hacked, improved but dodgy release and the official stable one. There are no copyright assignment agreements; it’s all GPL based, and the community works in that spirit.

To develop for the device, you use a host/target structure, cross-compiling on your PC and netbooting the device from it. We have a virtual machine image that does the setup for you, so you can get started porting apps in 30 minutes!

A tutorial is online that ports wget to the device. You don’t need to be a kernel hacker! There are some headaches here and there, and some proprietary parts make our answers disappointing, but the way those issues will get solved is the participation of more hackers.
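For a flavour of what that host/target workflow looks like, here is a minimal dry-run sketch. The toolchain prefix, netboot directory and configure flags are my assumptions for illustration, not the official Neuros setup - consult their wiki tutorial for the real values:

```shell
#!/bin/sh
# Hypothetical sketch of the host/target workflow: cross-compile on the PC,
# then install into a directory the device netboots (NFS-mounts) as its root.
# CROSS and NFS_ROOT are assumed names, not Neuros's actual configuration.
set -e

CROSS=arm-linux-            # assumed cross-toolchain prefix on the host PC
NFS_ROOT="$HOME/osd-rootfs" # assumed directory exported over NFS to the device

# 1. Configure an autotools app (the tutorial's example is wget) for the target
configure_cmd="./configure --host=arm-linux CC=${CROSS}gcc --prefix=/usr"

# 2. Build on the host, then install into the netboot root rather than into /
build_cmd="make"
install_cmd="make install DESTDIR=$NFS_ROOT"

# Dry run: print the commands instead of running them, since actually running
# them needs a real cross toolchain installed and an NFS export to the device.
echo "$configure_cmd"
echo "$build_cmd"
echo "$install_cmd"
```

The key idea is the split: `--host` tells configure to build binaries for the ARM target, while `DESTDIR` keeps the install off your PC’s root filesystem and inside the tree the device mounts over the network.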

So how can you participate? Go and see the wiki and SVN and so on. The best way to be an embedded device hacker is to be a device user too, so work out what you use. Other projects: XBox Media Center, which is mature and has been worked on for a long time. OpenMoko is a phone - very interesting, and it can change a lot of things. And internet appliance type stuff like the Nokia 770 and 800; there is interesting multimedia stuff there too. Also Rockbox, which is not Linux, but a very mature replacement firmware that is free software, for a range of devices including iPods. MythTV is also mature - it’s a PC based media center architecture, and we share a lot of goals and ideas with them, even if code doesn’t get shared because of technicalities.

So, let’s see it in action. Here’s a video playing off the SD card, an ASF. Recording is to plain MPEG4 files, so you can use them anywhere; we have standard profiles for the iPod and PSP too. You can store them on an external hard disk, share them around your network, put them on your laptop for the plane, or keep them on your own FTP server and grab them when you are on holiday - like a Slingbox. We have 5 or 6 Summer of Code projects.

The ‘coming soon’ area of the interface highlights what is on the way: we have a YouTube browser, a Flickr browser, even a simple video editor for cutting out ads…

Dave Crossland: LOL

Gerv Markham: Can we do ad detection?

Joe Born: Yeah, we have someone in the community with an algorithm for that; I don’t know how good it is though.

So we have an electronic programme guide from the Tribune Media company, Zap2it. They ceased doing their free guides for MythTV, but we pay for it. We don’t get around MP3 patents - we pay $10 for every device that goes out the door. No one hassles us for using free software codecs, as long as their patent money comes in.

Gerv Markham: how does the Electronic Programme Guide work?

Joe Born: The UK support is there, but it’s not global - hopefully enough critical mass can solve these problems, but yeah, it’s a big problem for regional situations.

Some new features are an xmms2 audio player that you can control with Nokia tablets, and our YouTube browser. We had no FLV decoder, but a community member ported an open source one, and Texas Instruments was very impressed, as it wasn’t part of their reference design. Texas Instruments have to do good customer service and hand-holding, and this is upside down for them. A manager at TI said, ‘How many people do I have to assign to get some benefit from this open source stuff?’ and I said that was the wrong way to think about it - a lot of these projects do it with no documentation at all, so all he needs to do is publish some documentation and let people handle it themselves.

Audience: You mentioned a ‘coming soon’ section as what we can do. What about the look’n’feel side of things?

Joe Born: Sure, the coming soon stuff isn’t just community stuff; we have several teams, and one is working on the UI. This product is 6 months old, in that we sold the first unit in fall 2006, so it’s a new project compared to MythTV or XBox Media Center. They have been going for years, so their UIs are already improved over our first version.

Dave Crossland: GPLv3 thoughts?

Joe Born: Scary. We want to make the best product for consumers, and I’d like to play DRM stuff. We won’t encumber stuff with DRM, but I’d love to say, “we play anything, whatever DRM you have; we take the hard parts away.” We spoke to the Free Software Foundation folks, and we get help from them on our modules. But I have mixed feelings, as we need to be successful, and to do that we need the best product. I’m pleased the Linux kernel is staying with GPLv2 for now, because GPLv3 there might be a problem for us.

Gerv Markham: But the DRM section just says that if you do DRM then users can remove it. So your concern is you won’t get a DRM license if it can be removed?

Joe Born: Sure. It’s ignorance mainly - I don’t really understand GPLv3; there have been multiple revisions. We’re between two immovable objects: Microsoft on one side, the Free Software Foundation on the other. We are between a rock and a hard place, and we’ll have to choose between supporting the formats out there and free software. At the moment we’re GPL, so everything is compatible; if the GPL forks into 2 versions, that’s gonna be a mess.

I don’t think there are any places where the playing field is more level, and there is no place where the stakes are higher either. All of you, take a hard look at free software and open electronics.

You can make a real difference.


Getting Real

I just noticed that Getting Real has been posted in full online, and is a recommended read.

GPLv3 Launch Speech

Today the recording of the 12 minute speech given by Richard Stallman at the release of GPLv3 was posted online (Highest Quality (103MB, BitTorrent), Medium Quality (53MB, BitTorrent) and Low Quality (8MB, BitTorrent)).

GPLv3 Blowback

Rick Moen posts about how GPLv3 blowback works, and how it will affect Microsoft.

And it appears some Microsoft designers have ripped off the Ubuntu logo

OpenMoko GNU/Linux is on sale!

The introduction says,

OpenMoko is a GNU / Linux based open software development platform.

I hope my efforts on the mailing list to assert the “GNU / Linux” name contributed to that.

A small win for the GNU project that’s nice to see.

The Global Software Industry in Transformation

With the name Galileo Galilei, we associate two of the most important cultural responses to the quandary of possessed physics. The first is an insistence upon freedom from censorship, that is “e pur si muove”, determination to prohibit the ownership of physics by an entity rich enough and powerful enough to define its physics as the only permissible physics, the only available physics for most ordinary people. And second, the first significant attempt in the history of the West to write scientific literature at the state of the art in a vernacular language, accessible to everyone. Galileo Galilei’s decision to publish in Italian is as important as his decision to risk confrontation with the Church, for what it says about the fundamental pillars of free science in the history of the West. Not merely, in other words, an insistence upon the freedom of ideas to work their will in skilled hands, but a determination that the ideas which motivate the world, which explain its behaviour, and which render it controllable, should be universally accessible to people regardless of their ability to acquire enough social surplus to have Latin. We have come, at the end of the 20th, and the beginning of the 21st centuries, to an equivalently important moment in the history of human civilisation. A moment at which the principle of the universalisation of free knowledge becomes, for technical reasons, universally fulfillable. Where it becomes, for technical reasons, possible for the first time in the history of human beings, to bring all useful and beautiful knowledge to everybody without regard to the ability to pay. … Ignorance and cultural deprivation are now preventable. What is the moral case for their continuance?

Wow - a full transcript of Eben Moglen’s latest speech, “The Global Software Industry in Transformation: After GPLv3”. Another excellent speech, and what’s notable about Eben’s speeches, to me, is that they are substantially different in their structure and examples and language every time.

I’d like to collect a bunch of the soundbite-worthy chunks, like the above and “Here, we made this. Would you like some? Take it.” as ready-made ingredients into some decent short film propaganda.

Latest Copyright versus Community lectures

Richard Stallman just gave another round of Copyright versus Community speeches, where he outlines the free software movement and how it might be applied to other kinds of literary works. (Yes, software is a literary work, not a product.) He outlines three broad categories of works, which is a useful model for thinking about any particular kind of work, as well as the model you get by untangling the confusion of the term “Intellectual Property”.