Planet Fellowship (en)

Monday, 20 April 2015

WebRTC video from mini-DebConf Lyon, France

fsfe | 12:15, Monday, 20 April 2015

Thanks to the Debian France community for putting on another very successful mini-DebConf in Lyon recently.

It was a great opportunity to meet more people for the first time and share ideas.

On the first day, I gave a talk about the current status of WebRTC and the Debian WebRTC portal. You can now watch the video of the talk online.

Unfortunately, we had some wifi problems that delayed the demonstration but we did eventually see it work successfully towards the end of the talk.

Friday, 17 April 2015

Libreoffice Design Session: Inserting a Chart

bb's blog | 07:31, Friday, 17 April 2015

The Libreoffice UX team presents ideas for an updated workflow to insert charts. We show mockups of how the dialog might look and ideas on how to tweak your graph to perfection. Probably everyone working with Libreoffice Calc uses charts. And maybe some have wondered why the dialog to insert a chart has a wizard-like workflow [...]

Thursday, 16 April 2015

Debian Jessie release, 100 year ANZAC anniversary

fsfe | 17:48, Thursday, 16 April 2015

The date scheduled for the jessie release, 25 April 2015, is also ANZAC day and the 100th anniversary of the Gallipoli landings. ANZAC day is a public holiday in Australia, New Zealand and a few other places, with ceremonies remembering the sacrifices made by the armed services in all the wars.

Gallipoli itself was a great tragedy. Australian forces were not victorious. Nonetheless, it is probably the best-remembered battle of all the wars. There is even a movie, Gallipoli, starring Mel Gibson.

It is also the 97th anniversary of the liberation of Villers-Bretonneux in France. The previous day had seen the world's first tank vs tank battle between three British tanks and three German tanks. The Germans won and captured the town. At that stage, Britain didn't have the advantage of nuclear weapons, so they sent in Australians, and the town was recovered for the French. The town has a rue de Melbourne and rue Victoria and is also the site of the Australian National Memorial for the Western Front.

It's great to see that projects like Debian are able to span political and geographic boundaries and allow everybody to collaborate for the greater good. ANZAC day might be an interesting opportunity to reflect on the fact that the world hasn't always enjoyed such community.

Monday, 13 April 2015

You can come to the Randa Meetings 2015 – Please register now

Mario Fux | 19:29, Monday, 13 April 2015

The dates for the sixth edition of the Randa Meetings are set: Sunday, 6th to Sunday 13th of September 2015. The first Sunday will be the day of arrival and the last Sunday accordingly the day of departure.

So what about you? If you know about Qt and touch gesture support, want to bring your KDE application to Android and Co, plan to work on KDE infrastructure for mobile systems, are a UI or UX designer for mobile and touch interfaces, want to make your software more accessible, or just want to work on your already ported KDE application, please register as soon as possible on our Sprints page.

The registration is open until the 13th of May 2015. Please add your estimated travel cost and what you plan to work on in Randa this September. You don’t need to include any accommodation costs as we organize this for you (see the Randa Meetings wiki page for further information about the building). After this date we will present a budget and work on a fundraiser (together with you) to make it possible for as many people as possible to come to Randa.

If there are any questions or further ideas don’t hesitate to contact me via email or on IRC in #randa.


Friday, 10 April 2015

OpenFOAM – Open Computational Fluid Dynamics

Seravo | 10:27, Friday, 10 April 2015

OpenFOAM (Open source Field Operation And Manipulation) is a numerical CFD (Computational Fluid Dynamics) solver and a pre/postprocessing software suite.

Special care has been taken to enable automatic parallelization of applications written using OpenFOAM high-level syntax. Parallelization can be further extended by using an MPI implementation such as Open MPI to distribute the simulation workload to multiple worker nodes.

Pre/post-processing tools like ParaView enable graphical examination of the simulation set-up and results.

The project code is free software, licensed under the GNU General Public License and maintained by the OpenFOAM Foundation.

A parallel version, OpenFOAM-extend, is a fork maintained by Wikki Ltd that provides a large collection of community-generated code contributions which can be used with the official OpenFOAM version.

What does it actually do?

OpenFOAM is aimed at solving continuum mechanical problems. Continuum mechanics deals with the analysis of kinematics and the mechanical behavior of materials modeled as a continuous mass rather than as discrete particles.

OpenFOAM has an extensive range of features to solve complex gas/fluid flows involving chemical reactions, turbulence, heat transfer, solid dynamics, electromagnetics and much more!

The software suite is used widely in the engineering and scientific fields concerning simulations of fluid flows in pipes, engines, combustion chambers, pumps and other diverse use cases.


How is it used?

In general, the workflow adheres to the following steps:

  • pre-process
    • physical modeling
    • input mesh generation
    • visualizing the input geometry
    • setting simulation parameters
  • solving
    • running the simulation
  • post-process
    • examining output data
    • visualizing the output data
    • refining the simulation parameters
    • rerunning the simulation to achieve desired results

Later we will see an example of a 2D water-flow simulation following these steps.


What can Seravo do to help a customer running OpenFOAM?

Seravo can help your organization by building and maintaining a platform for running OpenFOAM and related software.

Our services include:

  • installing the host platform OS
  • host platform security updates and maintenance
  • compiling, installing and updating the OpenFOAM and OpenFOAM-extend suites
  • cluster set-up and maintenance
  • remote use of visualization software

Seravo has provided the above-mentioned services to its customers, including building a multi-node OpenFOAM cluster.


OpenFOAM example: a simplified 2D laminar-flow simulation of a breaking water dam hitting an obstacle in an open container

N.B. Some steps are omitted for brevity!

The input files for a simulation are ASCII text files in a defined open format.

Inside the working directory of a simulation case, we have many files defining the simulation environment and parameters, for example:

  • constant/polyMesh/blockMeshDict
    • defines the physical geometries; walls, water, air
  • system/controlDict
    • simulation parameters that define the time range and granularity of the run
  • constant/transportProperties
    • defines material properties of air and water used in simulation
  • numerous other control files define properties such as gravitational acceleration, physical properties of the container materials and so on

In this example, the simulated timeframe will be one second, with an output snapshot every 0.01 seconds.
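As an illustration, the time-control part of system/controlDict for such a run might look like the following sketch. The keywords are standard OpenFOAM dictionary entries; the deltaT value is an assumption for this example:

```
application     interFoam;
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         1;        // simulated timeframe: one second
deltaT          0.001;    // initial time step (assumed value)
writeControl    adjustableRunTime;
writeInterval   0.01;     // output snapshot every 0.01 seconds
```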

OpenFOAM simulation input geometry


After the input files have been massaged into the desired shape, commands are executed to check and process them for the actual simulation run:

  1. process input mesh (blockMesh)
  2. initialize input conditions (setFields)
  3. optional: visually inspect start conditions (paraFoam/paraview)

The solver application in this case is the OpenFOAM-provided interFoam, a solver for two incompressible fluids that tracks the material interfaces and mesh motion.

After setup, the simulation is executed by running the interFoam command.
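Put together, a typical run of such a case looks roughly like the following shell session. This is a sketch that assumes OpenFOAM is installed and that we are inside the case directory; the parallel variant additionally assumes a working MPI setup:

```shell
# sketch: standard OpenFOAM utilities and solver, run from the case directory
blockMesh        # 1. build the mesh from constant/polyMesh/blockMeshDict
setFields        # 2. initialize the water/air phase fields
interFoam        # 3. run the two-phase solver serially

# for a multi-node cluster run, decompose the domain and launch via MPI instead:
# decomposePar
# mpirun -np 40 interFoam -parallel
```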

OpenFOAM cluster running full steam on 40 CPU cores.

After about 40 seconds, the simulation is complete and results can be visualized and inspected with ParaView:

Simulation output at 0 seconds.

Simulation output at 0.2 seconds.


And here is a fancy gif animation of the whole simulation output covering one second of time:



Thursday, 09 April 2015

Never fly Etihad again?

fsfe | 15:05, Thursday, 09 April 2015

This is the first time we've flown out to Australia with Etihad and it may also be the last.

We were due to fly back into Europe at CDG and head down to Lyon for the mini-DebConf this weekend.

departure board, 18 hour delay for Paris

Let's look at how our Etihad experience has worked out:

21:00 UTC Tuesday - waking up on Wednesday morning in Melbourne (UTC+10)

13:00 UTC Wednesday - leaving Melbourne about 11pm Wednesday night, a 12-13 hour flight to Abu Dhabi. We had heard about the air traffic control strikes in France (where we are going) and asked the airline if we should fly and they told us everything would be OK.

02:30 UTC Thursday - touchdown in Abu Dhabi, 6:30 am local time. Go to the transfer counter to ask for our boarding passes to CDG. At this stage, we were told that the connecting flight to CDG had been delayed 20 hours due to French strikes. As we are trying to reach Lyon for the mini-DebConf this weekend, we asked if we could leave Abu Dhabi on a 09:00 flight to Geneva. The Etihad staff told us to contact our travel agent (the flight was booked through Expedia) and for the next hour everybody's time was wasted making calls to Expedia who kept telling us to speak to Etihad. Whenever the Etihad customer service staff tried to speak to Expedia, the Expedia call center would hang up.

Eventually, the Etihad staff told us that the deadline for putting us on the Geneva flight had passed and we would be stuck in Abu Dhabi for at least 20 hours.

For flights to and from Europe, airlines have a responsibility to arrange hotels for passengers if there is a lengthy delay. If the airline is at fault, they must also pay some extra cash compensation but for a strike situation that is not applicable.

Etihad has repeatedly fobbed us off. Initially we were given vouchers for Burger King or a pizza slice and told to hang around the transfer counter.

By about 12:00 UTC (4pm local time, nine hours of waiting around the transfer counter) there was still no solution. One passenger was so upset that airport security was called to speak to him and he was taken away. The airline staff kept giving excuses. Some passengers had been sent to a hotel, but others were left behind. I asked them again about our hotel and they kept trying to fob me off.

Faced with the possibility that I would miss two nights of sleep and eight hours time difference coming into Europe, I continued asking the Etihad staff to own up to their responsibilities and they eventually offered us access to their airport lounge. We discovered some other passengers in the lounge too, including the passenger who had earlier been escorted away by security.

This is unlike anything we've experienced with any other airline.

At every opportunity (the check-in at Melbourne, or when the Geneva flight was boarding), the airline has failed to make arrangements that would have avoided cost and inconvenience.

Assuming the flight goes ahead with a 20 hour delay, we will arrive in CDG some time Friday morning and not really sleep in a proper bed again until Friday night, about 70 hours after getting up in Melbourne on Wednesday morning. Thanks Etihad, you are a waking nightmare.

The airline has been evasive about how they will deal with our onward travel from CDG to Lyon. We had booked a TGV train ticket already but it is not valid after such a long delay and it seems quite possible that trains will be busier than usual thanks to the air traffic control strike. So we don't even know if we will be loitering around a Paris airport or railway station for hours on Friday and nobody from the airline or Expedia really seems to care.


The only conclusion I can reach from this experience is that Etihad can't be trusted, certainly not for long journeys such as Australia to Europe. Having flown through Singapore, Kuala Lumpur and Hong Kong, I know that air passengers have plenty of options available and there are many airlines that do go the extra mile to look after passengers, especially on such long journeys. The airline missed chances to re-route us at every turn. It looks like they help some passengers (like those who did get to hotels) but leave many others high and dry just to stay within their budget.

Wednesday, 08 April 2015

EOMA-68: The Return

Paul Boddie's Free Software-related blog » English | 16:37, Wednesday, 08 April 2015

It is hard to believe that almost two years have passed since I criticised the Ubuntu Edge crowd-funding campaign for being a distraction from true open hardware initiatives (becoming one which also failed to reach its funding target, but was presumably good advertising for Ubuntu’s mobile efforts for a short while). Since then, the custodians of Ubuntu have pressed on with their publicity stunts, the most recent of which involving limited initial availability of an Ubuntu-branded smartphone that may very well have been shipping without the corresponding source code for the GPL-licensed software being available, even though it is now claimed that this issue has been remedied. Given the problems with the same chipset vendor in other products, I personally cannot help feeling that the matter might need more investigation, but then again, I personally do not have time to chase up licence compliance in other people’s products, either.

Meanwhile, some genuine open hardware initiatives were mentioned in that critique of Ubuntu’s mobile strategy: GTA04 is the continuing effort to produce a smartphone that continues the legacy of the Openmoko Neo FreeRunner, whose experiences are now helping to produce the Neo900 evolution of the Nokia N900 smartphone; Novena is an open hardware laptop that was eventually successfully crowd-funded and is in the process of shipping to backers; OpenPandora is a handheld games console, the experiences from which have since been harnessed to initiate the DragonBox Pyra product with a very similar physical profile and target audience. There is a degree of collaboration and continuity within some of these projects, too: the instigator of the GTA04 project is assisting with the Neo900 and the Pyra, for example, partly because these projects use largely the same hardware platform. And, of course, GNU/Linux is the foundation of the software for all this hardware.

But in general, open hardware projects remain fairly isolated entities, perhaps only clustering into groups around particular chipsets or hardware platforms. And when it comes to developing a physical device, the amount of re-use and sharing between projects is perhaps less than we might have come to expect from software, particularly Free Software. Not that this has necessarily slowed the deluge of boards, devices, products and crowd-funding campaigns: everywhere you look, there’s a new Arduino variant or something claiming to be the next big thing in the realm of the “Internet of Things” (IoT), but after a while one gets the impression that it is the same thing being funded and sold, over and over again, with the audience probably not realising that it has all mostly been done before.

The Case for Modularity

Against this backdrop, there is one interesting and somewhat unusual initiative that I have only briefly mentioned before: the development of the EOMA-68 (Embedded Open Modular Architecture 68) standard along with products to demonstrate it. Unlike the average single-board computer or system-on-module board, EOMA-68 attempts to define a widely-used modular computing unit which is also a complete computing device, delegating input (keyboards, mice, storage) and output (displays) to other devices. It has often been repeated that phones today are just general-purpose computers that happen to be able to make calls, and the same can be said for a lot of consumer electronics equipment that traditionally was either much simpler or employed only special-purpose computing units to perform its work: televisions are a reasonably illustrative example of this.

And of course, computers as we know them come in all shapes and sizes now: phones, media players, handhelds, tablets, netbooks, laptops, desktops, workstations, and so on. But most of these devices are not built to be upgraded when the core computing part of them becomes obsolete or, at the very least, less attractive than the computing features of newer devices, nor can the purchaser mix and match the computing part of one device with the potentially more attractive parts of another: one kind of smart television may have a much better screen but a poorer user interface that one would want to replace, for example. There are workarounds – some people use USB-based “plug computers” to give their televisions “smart” capabilities – but when you buy a device, you typically have to settle for the bundled software and computing hardware (even if the software might eventually be modifiable thanks to the role of the GPL, subject to constraints imposed by manufacturers that might prevent modification).

With a modular computing unit, the element of choice is obviously enhanced, but it also helps those developing open hardware. First of all, the interface to the computing unit is well-defined, meaning that the designers of a device need not be overly concerned with the way the general-purpose computing functionality is to be provided beyond the physical demands of that particular module and the facilities provided by it. Beyond such constraints, being able to rely on a tested functional element, designers can focus on the elements of their device that differentiate it from other devices without having to master the integration of their own components of interest with those required for the computing functionality in one “make or break” hardware design that might prove too demanding to get right first time (or even second or third time). Prototyping complicated circuit designs can quickly incur considerable costs, and eliminating complexity from what might be described as the “peripheral board” – the part providing the input and output capabilities and the character of a particular device – not only reduces the risk of getting things wrong, but it could make the production of that board cheaper, too. And that might open up device design to a broader group of participants.

As Nico Rikken explains, EOMA-68 promises to offer benefits for hardware designers, software developers and customers. Modularity does make sense if properly considered, which is perhaps why other modularity initiatives like Phonebloks have plenty of critics even though they share the same worthy objectives of reducing waste and avoiding device obsolescence: with vague statements about modularity and the hint of everything being interchangeable and interoperating with everything, one cannot help being skeptical about the potential complexity and interoperability problems that could result, not to mention the ergonomic issues that most people can easily relate to. By focusing on the general-purpose computing aspect of modularity, EOMA-68 addresses the most important part of the hardware for Free Software and delegates modularity elsewhere in the system to other initiatives that do not claim to do it all.

A Few False Starts

Unfortunately, not everything has gone precisely according to schedule with EOMA-68 so far. After originally surfacing as part of an initiative to make a competitive ARM-based netbook, the plan was to make computing modules and “engineering boards” on the way to delivering a complete product, and the progress of the first module can be followed on the Allwinner A10 news page on the Rhombus Tech Web site. From initial interest from various parties at the start of 2012, and through a considerable amount of activity, by May 2013, working A10 boards were demonstrated running Debian Wheezy. And a follow-up board employing the Allwinner A20 instead of the A10 was demonstrated running Debian at the end of October 2014 as part of a micro-desktop solution.

One might have thought that these devices would be more widely available by now, particularly as development began in 2012 on a tablet board to complement the computing modules, with apparently steady progress being made. Now, the development of this tablet was driven by the opportunity to collaborate with the Vivaldi tablet project, whose own product had been rendered unusable for Free Software usage by the usual product iteration performed behind the scenes by the contract manufacturer changing the components in use without notice (as is often experienced by those buying computers to run Free Software operating systems, only to discover that the wireless chipset, say, is no longer one that is supported by Free Software). With this increased collaboration with KDE-driven hardware initiatives (Improv and Vivaldi), efforts seemingly became directed towards satisfying potential customers within the framework of those other initiatives, so that to acquire the micro-engineering board one would seek to purchase an Improv board instead, and to obtain a complete tablet product one would place an advance order for the Vivaldi tablet instead of anything previously under development.

Somehow during 2014, the collaboration between the participants in this broader initiative appears to have broken down, with there undoubtedly being different perspectives on the sequence of events that led to the cancellation of Improv and Vivaldi. Trawling the mailing list archives gives more detail but not much more clarity, and it can perhaps only be said that mistakes may have been made and that everybody learned new things about certain aspects of doing business with other people. The effect, especially in light of the deluge of new and shiny products for casual observers to purchase instead of engaging in this community, and with many people presumably being told that their Vivaldi tablet would not be shipping after all, probably meant that many people lost interest and, indeed, hope that there would be anything worth holding out for.

The Show Goes On

One might have thought that such a setback would have brought about the end of the initiative, but its instigator shows no sign of quitting, probably because genuine hardware has been made, and other opportunities and collaborations have been created on the way. Initially, the focus was on an ARM-based netbook or tablet that would run Free Software without the vendor neglecting to provide the complete corresponding source for things like the Linux kernel and bootloader required to operate the device. This requirement for licence compliance has not disappeared or diminished, with continuing scrutiny placed on vendors to make sure that they are not just throwing binaries over the wall.

But as experience was gained in evaluating suitable CPUs, it was not only ARM CPUs that were found to have the necessary support characteristics for software freedom as well as for low power consumption. The Ingenic jz4775, a sibling of the rather less capable jz4720 used by the Ben NanoNote, uses the MIPS architecture and may well be fully supported by the mainline Linux kernel in the near future; the ICubeCorp IC1T is a more exotic CPU that can be supported by Free Software toolchains and should be able to run the Linux kernel in addition to Android. Alongside these, the A20 remains the most suitable of the options from Allwinner, whose products have always been competitively priced (which has also been a consideration), but there are other ARM derivatives that would be more interesting from a vendor cooperation perspective, notably the TI AM389x series of CPUs.

Meanwhile, after years of questions about whether a crowd-funding campaign would be started to attract customers and to get the different pieces of hardware built in quantity, plans for such a campaign are now underway. While initial calls for a campaign may have been premature, I now think that the time is right: people have been using the hardware already produced for some time, and considerable experience has been amassed along the way up to this point; the risks should be substantially lower than quite a few other crowd-funding campaigns that seem to be approved and funded these days. Not that anyone should seek to conceal the nature of crowd-funding and the in-built element of risk associated with such campaigns, of course: it is not the same as buying a product from a store.

Nevertheless, I would be very interested to see this hardware being made, and I am even on record as having said so. Part of this is selfishness: I could do with some newer, quieter, less power-consuming hardware. But I also think that a choice of different computing modules, supporting Free Software operating systems out of the box, with some of them candidates for FSF endorsement, and offering a diversity of architectures, would be beneficial to a sustainable computing infrastructure in the longer term. If you also think so, maybe you should follow the progress of EOMA-68 over the coming weeks and months, too.

Friday, 17 April 2015

Libreoffice Design Session: Shapes

bb's blog | 07:31, Friday, 17 April 2015

The Libreoffice UX team presents two proposals on how to access shapes from the sidebar, with the goal of unifying the look and feel of Libreoffice tools and getting more space for additional categories of shapes. Libreoffice Draw has been somewhat neglected lately. But we didn’t forget it and started to pimp [...]

Tuesday, 07 April 2015

Back on the web…

Riccardo (ruphy) Iaconelli - blog | 16:08, Tuesday, 07 April 2015


It’s been almost 10 years. Well, I am cheating a little bit, since my real other blog, which I used until 2011 or so, has just moved somewhere I can’t reach anymore, and the DNS just points to the nothingness… ;-)  (actually, if anyone is aware of where/who could have old files, I’d be really grateful)

Much has happened in the past 3/4 years. I turned 24. I co-founded a company (Ispirata) and left it one year after foundation, giving up all my links and participation in it, in order to finish my studies. A few months ago I got a Bachelor of Science in Physics, and two weeks ago I started my Master’s degree at the top Italian university for scientific research. I have been doing research at CERN and launched WikiFM, an open science/training project now actively used by some of the top Italian universities (and we are going international as I write!), which gained the patronage of Wikimedia Italy and recently became the first project incubated by KDE. I also moved to Sweden for a few months to refine my studies and got my own place in Milano.

WikiFM is the project that is currently giving me the greatest satisfaction, with some recent big happenings which will greatly contribute to its success (teaser!). Much of the content is unfortunately – for now – in Italian only, but as I stated above we are deep in the process of internationalizing it (with the help of the great KDE Sysadmin team), in an implementation which will become very similar to Wikimedia’s multi-domain approach. For English-speaking users (and for the broader community) I am writing a proper announcement. However, I don’t want to spoil my next post, so I will hold the announcement until we manage to open; the deadline for this is the 17th of April.

If, in the meantime, there is anybody who is knowledgeable about CSS and/or Mediawiki (development and/or sysadminning) and/or Python+LaTeX, and has a couple of hours (I promise, no more than that) to spend to help a Free Software project, that would speed things up immensely. Just comment here, drop me a line, or, even better, join the WikiFM mailing list (created today!).

That was all for now, KDE! I missed you a lot. <3 =)

Saturday, 04 April 2015

Spread the message with Free Software merchandise

I love it here » English | 10:46, Saturday, 04 April 2015

Bag with the slogan "There is no cloud, just other people's computers"

Our "There is no cloud, just other people's computers" bag

For those of you who are not subscribed to our newsletter: During the last weeks, many people ordered our “There is no cloud, just other people’s computers” stickers. Now Rich Folsom has written a Chromium browser add-in which converts “the cloud” to “other people’s computers”.

Since so many people like the slogan, we now also have the corresponding “There is no cloud, just other people’s computers” bags in our webshop. Furthermore we have a new Open Standard t-shirt with robots in fitted light blue or a non-fitted khaki, the “I love Free Software” t-shirt in light blue, or a fitted “Hacking for Freedom” t-shirt in grey, as well as the metallic “GNU/Linux inside” stickers and a golden GNU pin.

If you want to spread the Free Software message at work, conferences, or when you are shopping, you can order the equipment on our merchandise page.

Parsing Emacs OrgMode files, EU patent debate, and vacation!

Creative Destruction & Me » FLOSS | 09:00, Saturday, 04 April 2015

After starting the year with two rather busy months, I planned to take it easy a bit. Such an optimistic plan of course never works out as intended… In times like these, it really helps that I love my job(s). It included a trip to Brussels to present the Open Source perspective on the role of patents at the European Commission Joint Research Center. Between office hunting and strategy workshops, there was also some time to hack on the OrgModeParser! See below.

I already mentioned earlier the plans to present about the situation of the Open Source community as a consumer of the patent system at the conference on “Innovation in a European Digital Single Market – The Role of Patents” in Brussels on March 17. FSFE, OpenForum Europe, colleagues at OIN and fellow Open Source supporters provided great feedback for the presentation. Many thanks to everybody who contributed! In the end, the concept for the presentation (which was a short introduction to a following panel discussion) was to explain five concrete difficulties the patent system causes in a collaborative production environment. The slides are available on the conference site. I hope to find some time to write up the presentation in a future blog post.


Sage joined the Open Invention Network. OIN is the world’s largest patent non-aggression community, with the mission to protect Linux and Open Source. When a publicly listed company that grew to success long before Linux really took off subscribes to it, that speaks to the credibility patent non-aggression has achieved and to how well OIN represents that idea in the Open Source space. Thanks, Sage! More large and small companies are considering this step. Your company should do so, too. If you have any questions, feel free to contact me.

The Endocode office hunt continues. We visited quite a number of available spaces, but the market is contested and suitable space is hard to come by. We are trying to have everybody involved have a say in the choice, too. This naturally leads to some quite lively discussions. An essential goal is to create a space that serves well as the home of the creative productivity our team enjoys. This includes flexible ways of working together, a mix of functional and motivational (read: fun) requirements, and generally an inviting atmosphere that one can look forward to when getting up in the morning. I think it is worth it to be picky. Hopefully we can invite for an office warming party soon…

We also continued with our series of Endocode strategy workshops. Our work revolves around Open Source from different angles – software engineering, DevOps and contributor relations. Analysing these different fields to identify a value chain that ties them all together is in a way intuitive for those of us who “grew up” in communities, but from a business strategy perspective there is a significant gap in understanding and values. Still, there must be a way, considering that Open Source is in essence a coordination mechanism for collaborative production, which is in turn a purely economic concept. We are making good progress, but I expect it to still take significantly more effort. Such thought experiments are rather engaging and a great challenge to be part of.

Then, finally, I found some time to hack on a fun project of mine (woohoo!). A while ago I came up with the completely insane idea to access the content of Emacs OrgMode files from independent programs. Emacs OrgMode is hands-down about the best tool for the collection of notes, ideas, tasks, for tracking time, for writing content, and so much more. Nobody would ever argue about that :-) I wanted to be able to read OrgMode files in the programs I write, which are usually implemented in C++ and Qt. The code of OrgModeParser is on Github and LGPL 3 licensed. This week, this yielded a first working version and a demo program that integrates clocked work time data into the bash prompt:

The yellow line in the screenshot is the output of the OrgModeParser clock time demo, embedded into the bash prompt. It shows the currently clocked task, the running time of the current session, and on the right side of the screen the time clocked today and this week. One curiosity that triggered this was the inclusion of lambda functions into C++ with the recent updates of the language standard. There were quite a number of discussions of how the new C++ better supports functional programming approaches and is closer to some concepts of scripting languages, which I wanted to try out. It leads to some really interesting code:


// Find all clock lines that are incomplete (not closed):
auto const notCompleted = [](const ClockLine::Pointer& element) {
    return element.dynamicCast<CompletedClockLine>() == 0;
};
auto clocklines = findElements<ClockLine>(toplevel_, -1, notCompleted);
// Sort by start time, to determine the latest task that was started:
auto const startedLater = [](const ClockLine::Pointer& left, const ClockLine::Pointer& right) {
    return left->startTime() > right->startTime();
};
sort(clocklines.begin(), clocklines.end(), startedLater);

This finds all started, but not completed clock lines in an OrgMode file and sorts them by the start time with the last clocked-in task first in the list. Lambdas and automatic typing are a huge step forward in readability, and also from a practical point of view: The compiler prevents many mistakes, and of course a breakpoint can be set in the body of a lambda function. Good stuff, and the parser is fast enough to process a 100kByte TODO list in mere milliseconds, so it can be integrated into a typical bash prompt like this:


PS1="$PS1\$(OrgModeParser_ClockTimeDemo -p -c\${COLUMNS} ~/Org/)\n"

The code builds and installs with CMake and should compile on any recent Linux distribution or OSX installation. It requires Qt 5. I haven’t tried building it on Windows. If you are like me and occasionally (ahem :-) ) forget to clock into the task you currently work on, this may be of help. It is however meant to be a demo of what the parser can do: load an OrgMode file into a data structure that can be queried or filtered, updated and saved out again. Potential applications include embedding OrgMode data into GUI applications, or creating or reading TODO or CLOCK entries from other external tools like time trackers. Or even, which is one of the main long-term motivations, enabling integration with online project management tools like Redmine.

Next week I will be on a family vacation, which includes being offline. Offline as in no internet, no power outlets, and most of the time not even a hint of phone reception. I am so looking forward to it. I will check back in on April 13. Happy Easter holidays!

Filed under: Coding, CreativeDestruction, English, FLOSS, KDE, OSS, Qt

Friday, 03 April 2015

Open Hardware and Free Software: Not Just For The Geeks

Paul Boddie's Free Software-related blog » English | 22:09, Friday, 03 April 2015

Having seen my previous article about the Fairphone initiative’s unfortunate choice of technologies mentioned in various discussions about the Fairphone, I feel a certain responsibility to follow up on some of the topics and views that tend to get aired in these discussions. In response to an article about an “open operating system” for the Fairphone, a rather solid comment was made about how the initiative still seems to be approaching the problem from the wrong angle.

Because the article comments have been delegated to a proprietary service that may at some point “garbage-collect” them from the public record, I reproduce the comment here (and I also expanded the link previously provided by a link-shortening service for similar and other reasons):

You are having it all upside down.
Just make your platform open instead of using proprietary chipsets with binary blobs! Then porting Firefox OS to the Fairphone would be easy as pie.

Not listening to the people who said that only free software running on open hardware would be really fair is exactly what brought you this mess: Our approach to software and ongoing support for the first Fairphones
It is also why I advised all of my friends and acquaintances not to order a Fairphone until it becomes a platform that respects user freedom. Turns out I was more than right.
If the Fairphone was an open platform that could run Firefox OS, Replicant or pure Debian, I would tell everybody in need of a cellphone to buy one.

I don’t know the person who wrote this comment, but it is very well-formulated, and one wouldn’t think that there would be much to add. Unfortunately, some people seem to carry around their own misconceptions about some of the concepts mentioned above, and unfortunately, they are quite happy to propagate those misconceptions as if they were indisputable facts. Below, I state the real facts in the headings and quote each one of the somewhat less truthful misconceptions for further scrutiny.

Open Hardware and Free Software is for Everyone

Fairphone should not make the mistake of producing a phone for geeks. Instead, it should become a phone for everyone.

Just because people have an opinion about technology and wish to see certain guarantees made about the nature of that technology does not mean that the result is “for geeks”. In fact, making the hardware open means that more people can figure things out about it, understand it, and improve the way it works and the software that uses it. Making the software truly open means that more people can change it, fix it, enhance it, and extend the usable life of the device. All of this benefits everyone, whereas closed hardware and proprietary software ultimately benefit only the small groups of people who respectively designed the device and wrote the software, both of whom are very likely to lose interest in sustaining the life of that product as soon as they have another one they want to sell you. (And often, in the case of the hardware, as soon as it leaves the factory.)

User Freedom Means Exactly User Freedom

‘User freedom’ is often used when actually ‘developers freedom’ is meant. It is more of an ideology.

Incorrect! Those of us who use the term Free Software know exactly what we mean: it is the freedom of the end-user to exercise precisely those privileges that have resulted in the work being produced and delivered to them. Now, there are people who advocate “permissive licences” that do favour developers in that they allow people to use the work of others and to then provide a piece of software under conditions that grants the end-user only limited privileges, taking away those privileges to see how the entire work is constructed, along with those that allow the entire work to be improved and shared. Whether one sees either of these as an ideology, presumably emphasising one’s own “pragmatism” in contrast, is largely irrelevant because the genuine pragmatism involved in Free Software and the propagation of a broader set of privileges actually delivers sustainability: users – genuine end-users, not middle-men – get the freedom to participate in how the product turns out, and crucially, how it lives on after the original producer has decided to go off and do something else.

Openness Does Not Preclude Fanciness (But Security Requires Openness)

What people want is: user friendly interface, security/privacy, good specs and ability to install apps and games. [...] OpenSource is a nice idea, but has its disadvantages too: who is caring about quality?

It’s just too easy for people to believe claims about privacy and security, even after everybody found out that they were targets of widespread surveillance, and even after various large corporations who presumably care about their reputations have either lost the personal details of their users to criminals or have shared those details with others (who also have criminal or unethical intent). Believing the sales-pitch about total privacy and robust security, those people will happily reassure themselves and others that no company would allow its reputation to be damaged by any breach of privacy or security! But there are no guarantees of security or privacy if you cannot trust the systems you use, and there is no way of trusting them without being able to inspect how they work. More than ever, people need genuine guarantees of security and privacy – not reassurances from salesmen and advertisers – and the best way to start off on the path towards such guarantees is to be able to deploy Free Software on a device that you fully control.

And as for quality, user-friendliness and all the desirable stuff: how many people use products like Firefox in its various forms every single day? Such Free Software solutions have not merely set the standard over the years, but they have kept technologies like the Web relevant and viable, in stark contrast to proprietary bundled programs like Internet Explorer that have actually impaired technological and social progress, with “IE” doing its bit by exhibiting a poor record of adherence to standards and a continuous parade of functionality and security bugs, not to mention constant usability frustrations endured by its unfortunate (and frequently involuntary) audience of users.

Your Priorities Make Free Software Important

I found the following comment to be instructive:

For me open source isn’t important. My priorities are longevity/updates, support, safety/privacy.

The problem is this: how can you guarantee longevity, updates, support, safety and privacy without openness? Safety and privacy would require you to have blind trust in someone whose claims you cannot verify. Longevity, updates and support require you to rely on the original producer’s continued interest in the product that you have just purchased from them, and should it become more profitable for them to focus on other products (that they might want you to buy instead of continuing to use the one you have), you might be able to rely on the goodwill of that producer to transfer their responsibilities to others to do the thankless tasks of maintenance and support. But it may well be the case that no amount of money will be able to keep that product viable for you: the producer may simply refuse to support it or to let others support it. Perhaps some people may step in and reverse-engineer the product and make an effort to keep it viable, but wouldn’t it be better to have an open product to start with, where people can choose how it is maintained – and thus sustained – for as long as people still want to use it?

Concepts like open hardware and Free Software sound like topics for the particularly-interested, but they provide the foundations for those topics of increasing interest and attention that people claim to care so much about. Everybody deserves things like choice, democracy, privacy, security, safety, control over their own lives and destinies, and so on. Closed hardware and proprietary software may be used on lots of devices, and people may be getting a lot of use out of those devices, but the users of those devices enjoy the benefits only as long as it remains in the interests of the producers of those devices and the accompanying software to allow them to do so. Furthermore, few or none of those users can be sure whether any of those important things – their rights – are being impaired by their use of those devices. Are their communications being intercepted, collected, analysed? Few people would ever know.

Free Software and open hardware empower their users with the control that proprietary technologies deny their users. But shouldn’t everybody be able to benefit from such control? That’s why a device that is open hardware and which runs Free Software really is for everyone, not just for “geeks”.

Saturday, 28 March 2015

Open letter to Apple advertisement

FSFE Fellowship Vienna » English | 09:50, Saturday, 28 March 2015

Dear every mother counts team,

Thank you for your work concerning health care for mothers. You have taken on an important and demanding challenge.

Please consider my following concern regarding Christy Turlington’s Apple advertisement:

I can understand why a marketing contract with Apple must look tempting. The deal might bring in a considerable amount of money. This can potentially fund urgently needed projects. But Christy Turlington’s prominent involvement in the promotion of Apple’s iWatch unfortunately still might counteract important aspects of your work.

Most mothers with insufficient access to health care probably only lack the necessary funds for getting it. Proper education would open a lot of doors in that regard. Modern education heavily depends on technology. Therefore, access to affordable technology and knowledge about it is of paramount importance.

Unfortunately, open access is a foreign concept to Apple, which locks down everything it offers. Everything they develop is only made available to paying customers. Nothing is shared freely. They work with aggressively enforced patents, closed standards and massive legal restrictions. Therefore, others are actively hindered in offering similar services or products under fairer conditions. [1]

Only freely available, well documented technology without restrictions gives everyone opportunities. Countless people around the globe discover, use, analyse, adapt and spread free software every day, regardless of whether they want to dig a well or organise their local resources more efficiently. If technology has no restrictions built in and comes without legal limitations attached to it, it ceases to be an instrument of power. Privileges are a given and don’t need to be sold or restrained. Apple itself has made heavy use of free software to build its own operating system. But unlike others, who not only took but contributed something, Apple decided to use the free software stack and then lock up what they had derived from it. This surely isn’t in line with your philosophy.

Free software is an ethical approach to technology, fostering sharing and caring. It has inspired many other movements. Its main concerns are independence and empowerment. So if you feel like teaming up with institutions in the field of technology, please consider ethically and socially aware players like Mozilla or even organisations like the Free Software Foundation!

My second concern about this marketing arrangement is privacy. Of course Apple promises everything the public wants to hear, but as long as it doesn’t give full public access to what is going on in its products, those are just nice words. Not even governments are allowed to check if Apple’s claims are met in reality. Of course this isn’t an Apple-only problem, but considering the unprecedented scope of data collection by the iWatch, this is a whole new dimension of surveillance acted out by a private, uncontrolled entity.

Free software, on the other hand, is trustworthy because one of its merits is its open source code. What it does is completely transparent and it can be audited by anyone who is interested. Even if you find features you don’t like, you are free to adapt, remove or just disable them. If someone has an idea of how to improve existing free software, that’s great, because it’s meant to be adapted and shared freely.

Please apply your principles in a wider context!

All the best for your work,
Franz Gratzer
fellow of the Free Software Foundation Europe (FSFE)
and an animal rights activist


Friday, 17 April 2015

Help to find better metaphors

bb's blog | 07:31, Friday, 17 April 2015

A big problem of KDE Activities is their name. It builds up a poor mental model and thus makes life hard for users. With this post we ask you to help us find a better name for the underlying concepts. When I talk to people about our quest to make KDE Activities work, one of [...]

Thursday, 26 March 2015

WebRTC: DruCall in Google Summer of Code 2015? - fsfe | 21:58, Thursday, 26 March 2015

I've offered to help mentor a Google Summer of Code student to work on DruCall. Here is a link to the project details.

The original DruCall was based on SIPml5 and released in 2013 as a proof-of-concept.

It was later adapted to use JSCommunicator as the webphone implementation. JSCommunicator itself was updated by another GSoC student, Juliana Louback, in 2014.

It would be great to take DruCall further in 2015. Here are some of the possibilities that are achievable in GSoC:

  • Updating it for Drupal 8
  • Support for logged-in users (currently it just makes anonymous calls, like a phone box)
  • Support for relaying shopping cart or other session cookie details to the call center operative who accepts the call

Help needed: could you be a co-mentor?

My background is in real-time and server-side infrastructure and I'm providing all the WebRTC SIP infrastructure that the student may need. However, for the project to have the most impact, it would also be helpful to have some input from a second mentor who knows about UI design, the Drupal way of doing things and maybe some Drupal 8 experience. Please contact me ASAP if you would be keen to participate either as a mentor or as a student. The deadline for student applications is just hours away but there is still more time for potential co-mentors to join in.

WebRTC at mini-DebConf Lyon in April

The next mini-DebConf takes place in Lyon, France on April 11 and 12. On the Saturday morning, there will be a brief WebRTC demo and there will be other opportunities to demo or test it and ask questions throughout the day. If you are interested in trying to get WebRTC into your web site, with or without Drupal, please see the RTC Quick Start guide.

Friday, 17 April 2015

Libreoffice Design Session: Special Character

bb's blog | 07:31, Friday, 17 April 2015

The Libreoffice UX team presents two proposals for an improved dialog to insert special characters. While the first option was designed with a good balance between effort and benefit in mind, the second solution would be really awesome. The Libreoffice UX team discussed possible improvements for the dialog to insert special characters, in particular the [...]

Thursday, 26 March 2015

“Utmost transparency”, Free Software, and disintermediating the lobby business

Karsten on Free Software | 10:35, Thursday, 26 March 2015

Just how transparent does the European Parliament have to be?

In its own rules of procedure, the Parliament has set itself the high standard of conducting its affairs in “utmost transparency”. But what does this mean in practice?

For five years running, the Green group in the European Parliament has celebrated Document Freedom Day together with us. The focus of yesterday’s event was a recent study titled “Ensuring utmost transparency – Free Software and Open Standards under the Rules of Procedure of the European Parliament”. I recommend that you get the PDF and read it for yourself — it’s well worth your time.

Thanks to MEP Max Andersson, his assistants, and the always wonderful Erik Josefsson, we had a great panel lined up. With Professor Douwe Korff and lawyer Carlo Piana, two of the study’s authors were present to run us through the findings. The study brought many important results.

Why Open Standards and Free Software are essential for transparency

It points out that “utmost transparency” isn’t the same as making information available on request. Requests for access to documents (or Freedom of Information requests, as they’re called elsewhere) belong to the traditional approach, where information is secret by default, and you might get an exception if you ask really nicely.

This might have been acceptable in a pre-digital era, where information largely lived on paper, and gathering, storing and publishing it was expensive and difficult. Today, much of this work can be automated. It’s no longer acceptable to reveal information only when someone happens along to ask for an item in precisely the right way. Information about making policies and laws needs to be public by default.

The study argues that Open Standards are necessary for the Parliament to achieve its goal of “utmost transparency”. The lawmaking process is only really transparent if it can be analysed and reviewed by anyone, on any software platform, without having to ask anyone for permission. This is something that only Open Standards can deliver.

The authors highlight that transparency isn’t a state; it’s an ongoing process. In order to continuously deliver transparency through Open Standards, the Parliament needs to avoid being tied to any particular IT vendors. Instead, it should use Free Software wherever possible.

How the EP does on transparency – inside and outside views

Four panelists were there to discuss the study, and think about how its results might be put into practice.

Giancarlo Villela is the Director of DG ITEC, and thus responsible for the European Parliament’s IT systems. He pointed out that his team’s primary obligation was to keep those systems running, to make sure that the Parliament could do its work; but that they felt equally obliged to guarantee the security of the Parliament’s IT systems, and to make the Parliament’s work accessible to the public.

He highlighted that he wants the Parliament to be “avantgarde” in IT, taking leadership on transparency and openness among the European institutions. For all the things we at FSFE wish the EP’s IT systems would do better, I need to point out that the Parliament is indeed doing better on this front than the European Commission, let alone the Council, which sometimes seems to communicate its work to the public primarily through leaks. Now if the Parliament could only get its live streams of plenary sessions working for Free Software users, and make them easily accessible…

Martine Reicherts is the Director General of the EU’s Office for Official Publications in Luxembourg, and a former European Commissioner for Justice. She talked about her office’s effort to make EU legislation available and searchable in a way that’s useful for specialists. She said that it took her four years to make European law texts available free of charge. The publications office’s current challenge is to make the data searchable: “If you know a search engine that can efficiently handle 1.3 billion triple-store sets, let me know.”

Jonas Smedegaard is a Debian developer. He has worked quite a lot on the practicalities of making the EP’s systems more transparent. He created the DebianParl distribution, a version of Debian GNU/Linux aimed at people working in the Parliament, and has maintained a constructive dialogue with DG ITEC on actually getting the thing working. (“Constructive dialogue” is Brussels lingo for “an ongoing and sometimes lively argument”.) With his experience at the coalface of transparency, he had quite a few suggestions to make as to what the Parliament’s IT systems could be doing better — especially using standard protocols to handle email, rather than Microsoft’s proprietary tools.

Transparency and legitimacy

I was the final speaker on the panel. With most of the practicalities addressed, I took the opportunity to make a few broader points:

  • Transparency is essential for the legitimacy of the European institutions. If the Parliament, the Commission and the Council want fewer people to complain about their lack of legitimacy, then utmost transparency is an excellent way to go.
  • Currently, some of the best transparency tools around the EP are provided by volunteers, for example ParlTrack. This is no way for the EU’s central democratic institution to go about its business. The Parliament should do two things. It should make raw data and metadata about laws, amendments, and its members publicly available in real time. And it should provide some tools to help people make sense of the data. Many people and organisations will still choose some other way to have the data presented to them according to their needs; but the Parliament needs to provide at least a first entry point for the public.
  • Ideally, this sort of transparency will disintermediate today’s lobbying industry. A lot of people in Brussels and elsewhere spend a lot of time simply keeping track of what happens in the Parliament, in the Commission and the Council. Legions of analysts parse an endless stream of decisions, reports, white papers, speeches, and so forth. Lobby firms – and political NGOs like FSFE – base their influence not so much on superior knowledge of their subject matter, but rather on knowing what’s going on, where to look for information, and who to call in order to find out more. Greater transparency, especially through publication of real-time data on policy making, would make it possible for anyone to create tools to analyse this data. This, in turn, would hopefully make it easier for ordinary citizens to understand a given policy process, and to get involved.

Looking at it this way, Villela’s DG ITEC isn’t just running the EP’s IT systems for the Parliament’s own use. The way these systems are built and run determines how much access Europe’s citizens get to the lawmaking process, and how well they can understand it. These systems play an important role in determining the political legitimacy of the European Parliament, and by extension of the other EU institutions.

That’s a large responsibility for an IT department to carry. But there you have it.

Tuesday, 24 March 2015

The easiest way to run your own OpenID provider? - fsfe | 16:57, Tuesday, 24 March 2015

A few years ago, I was looking for a quick and easy way to run OpenID on a small web server.

A range of solutions were available but some appeared to be slightly more demanding than what I would like. For example, one solution required a servlet container such as Tomcat and another one required some manual configuration of Python with Apache.

I came across the SimpleID project. As the name implies, it is simple. It is written in PHP and works with the Apache/PHP environment on just about any Linux web server. It allows you to write your own plugin for a user/password database or just use flat files to get up and running quickly with no database at all.

This seemed like the level of simplicity I was hoping for so I created the Debian package of SimpleID. SimpleID is also available in Ubuntu.

Help needed

Thanks to a contribution from Jean-Michel Nirgal Vourgère, I've just whipped up a 0.8.1-14 package that should fix Apache 2.4 support in jessie. I also cleaned up a documentation bug and the control file URLs.

Nonetheless, it may be helpful to get feedback from other members of the community about the future of this package:

  • Is it considered secure enough?
  • Have other people found it relatively simple to install or was I just lucky when I tried it?
  • Are there other packages that now offer such a simple way to get OpenID for a vanilla Apache/PHP environment?
  • Would anybody else be interested in helping to maintain this package?
  • Would anybody like to see this packaged in other distributions such as Fedora?
  • Is anybody using it for any online community?

Works with HOTP one-time-passwords and LDAP servers

One reason I chose SimpleID is because of dynalogin, the two-factor authentication framework. I wanted a quick and easy way to use OTP with OpenID so I created the SimpleID plugin for dynalogin, also available as a package.

I also created the LDAP backend for SimpleID, that is available as a package too.

Works with Drupal

I tested SimpleID for logging in to a Drupal account with OpenID support enabled in Drupal, and it worked seamlessly. I've also tested it with a few public web sites that support OpenID.

Last chance to vote on the date for the Randa Meetings 2015

Mario Fux | 14:01, Tuesday, 24 March 2015

We plan to close the Doodle for the Randa Meetings date selection at the end of this week. So if you plan to participate please vote on the date that best fits you! And keep in mind two things:

  • You might bring your partner or family with you to Randa. We started this last year and people really liked it (and Randa is a nice holiday place in the Alps too – near to the world-known Zermatt).
  • If you see a lot of well-known names on the Doodle don’t think you shouldn’t be part of this. We always want to see new people with fresh energy and inspiring thoughts and ideas in Randa.

So please add yourself as quickly as possible or write me an email (fux AT kde) or ping me on IRC (unormal).



The outcome is unpredictable but your contribution is priceless

Don't Panic » English Planet | 10:32, Tuesday, 24 March 2015

Tomorrow is Document Freedom Day and this is the time when I am happy to see people around the world engaging on a local level to highlight the importance of Open Standards. All of them in their very own way … Continue reading

Monday, 23 March 2015

Fairphone back to the drawing board

Nico Rikken » fsfe | 19:29, Monday, 23 March 2015

Previously I’ve shared my thoughts and concerns on freedom in mobile operating systems. The Fairphone project unfortunately has a bad reputation in this area. Not because they don’t care, but because they failed to deliver on this promise in their first version. Other people involved in open hardware design for mobile devices saw it coming, as they’ve been struggling with exactly the same issue for many years already. Especially for them it shouldn’t have been a surprise that a perfectly fine hardware platform would be cut off from future firmware updates.

But as in any process of innovation, a new version allows for improvements. And so will a new upcoming version of the Fairphone. For months the Fairphone forum has featured several lengthy threads discussing alternative, generally more free operating systems. As I tried to state in a lengthy forum post, there are multiple interests in strong conflict with each other. So even whether there will be multiple OS flavours or one for all customers is not yet decided. The great news however is that the Fairphone team have taken on this challenge, big time! I’ve had some email conversations with Kees Jongenburger and Joe Mier regarding further plans and options. But more importantly, they went looking for alternatives at the Mobile World Congress. Over the course of the next months the plans will be finalized, so I’d like to encourage anybody with relevant information to contribute to the discussions. Let’s make the next Fairphone far more fair.

Sunday, 22 March 2015

The BBC Micro and the BBC Micro Bit

Paul Boddie's Free Software-related blog » English | 00:44, Sunday, 22 March 2015

At least on certain parts of the Internet as well as in other channels, there has been a degree of excitement about the announcement by the BBC of a computing device called the “Micro Bit”, with the BBC’s plan to give one of these devices to each child starting secondary school, presumably in September 2015, attracting particular attention amongst technology observers and television licence fee-payers alike. Details of the device are a little vague at the moment, but the announcement along with discussions of the role of the corporation and previous initiatives of this nature provides me with an opportunity to look back at the original BBC Microcomputer, evaluate some of the criticisms (and myths) around the associated Computer Literacy Project, and to consider the things that were done right and wrong, with the latter hopefully not about to be repeated in this latest endeavour.

As the public record reveals, at the start of the 1980s, the BBC wanted to engage its audience beyond television programmes describing the growing microcomputer revolution, and it was decided that to do this and to increase computer literacy generally, it would need to be able to demonstrate various concepts and technologies on a platform that would be able to support the range of activities to which computers were being put to use. Naturally, a demanding specification was constructed – clearly, the scope of microcomputing was increasing rapidly, and there was a lot to demonstrate – and various manufacturers were invited to deliver products that could be used as this reference platform. History indicates that a certain amount of acrimony followed – a complete description of which could fill an entire article of its own – but ultimately only Acorn Computers managed to deliver a machine that could do what the corporation was asking for.

An Ambitious Specification

It is worth considering what the BBC Micro was offering in 1981, especially when considering ill-informed criticism of the machine’s specifications by people who either prefer other systems or who felt that participating in the development of such a machine was none of the corporation’s business. The technologies to be showcased by the BBC’s programme-makers and supported by the materials and software developed for the machine included full-colour graphics, multi-channel sound, 80-column text, Viewdata/Teletext, cassette and diskette storage, local area networking, interfacing to printers, joysticks and other input/output devices, as well as to things like robots and user-developed devices. Although it is easy to pick out one or two of these requirements, move forwards a year or two, increase the budget two- or three-fold, or any combination of these things, and to nominate various other computers, there really were few existing systems that could deliver all of the above, at least at an affordable price at the time.

Some microcomputers of the early 1980s
Computer | RAM | Text | Graphics | Year | Price
Apple II Plus | Up to 64K | 40 x 25 (upper case only) | 280 x 192 (6 colours), 40 x 48 (16 colours) | 1979 | £1500 or more
Commodore PET 4032/8032 | 32K | 40/80 x 25 | Graphics characters (2 colours) | 1980 | £800 (4032), £1030 (8032) (including monochrome monitor)
Commodore VIC-20 | 5K | 22 x 23 | 176 x 184 (8 colours) | 1980 (1981 outside Japan) | £199
IBM PC (Model 5150) | 16K up to 256K | 40/80 x 25 | 640 x 200 (2 colours), 320 x 200 (4 colours) | 1981 | £1736 (including monochrome monitor, presumably with 16K or 64K)
BBC Micro (Model B) | 32K | 80/40/20 x 32/24, Teletext | 640 x 256 (2 colours), 320 x 256 (2/4 colours), 160 x 256 (4/8 colours) | 1981 | £399 (originally £335)
Research Machines LINK 480Z | 64K (expandable to 256K) | 40 x 24 (optional 80 x 24) | 160 x 72, 80 x 72 (2 colours); expandable to 640 x 192 (2 colours), 320 x 192 (4 colours), 190 x 96 (8 colours or 16 shades) | 1981 | £818
ZX Spectrum | 16K or 48K | 32 x 24 | 256 x 192 (16 colours applied using attributes) | 1982 | £125 (16K), £175 (48K)
Commodore 64 | 64K | 40 x 25 | 320 x 200 (16 colours applied using attributes) | 1982 | £399

Perhaps the closest competitor, already being used in a fairly limited fashion in educational establishments in the UK, was the Commodore PET. However, it is clear that despite the adaptability of that system, its display capabilities were becoming increasingly uncompetitive, and Commodore had chosen to focus on the chipsets that would power the VIC-20 and Commodore 64 instead. (The designer of the PET went on to make the very capable, and understandably more expensive, Victor 9000/Sirius 1.) That Apple products were notoriously expensive and, indeed, the target of Commodore’s aggressive advertising did not seem to prevent them from capturing the US education market from the PET, but they always remained severely uncompetitive in the UK as commentators of the time indeed noted.

Later, the ZX Spectrum and Commodore 64 were released. Technology was progressing rapidly, and in hindsight one might have advocated waiting around until more capable and cheaper products came to market. However, it can be argued that in fulfilling virtually all aspects of the ambitious specification and pricing, it would not be until the release of the Amstrad CPC series in 1984 that a suitable alternative product might have become available. Even then, these Amstrad computers actually benefited from the experience accumulated in the UK computing industry from the introduction of the BBC Micro: they were, if anything, an iteration within the same generation of microcomputers and would even have used the same 6502 CPU as the BBC Micro had it not been for time-to-market pressures and the readily-available expertise with the Zilog Z80 CPU amongst those in the development team. And yet, specific aspects of the specification would still be unfulfilled: the BBC Micro had hardware support for Teletext displays, although it would have been possible to emulate these with a bitmapped display and suitable software.

Arise Sir Clive

Much has been made of the disappointment of Sir Clive Sinclair that his computers were not adopted by the BBC as products to be endorsed and targeted at schools. Sinclair made his name developing products that were competitive on price, often seeking cost-reduction measures to reach attractive pricing levels, but such measures also served to make his products less desirable. If one reads reviews of microcomputers from the early 1980s, many reviewers explicitly mention the quality of the keyboard provided by the computers being reviewed: a “typewriter” keyboard with keys that “travel” appear to be much preferred over the “calculator” keyboards provided by computers like the ZX Spectrum, Oric 1 or Newbury NewBrain, and they appear to be vastly preferred over the “membrane” keyboards employed by the ZX80, ZX81 and Atari 400.

For target audiences in education, business, and in the home, it would have been inconceivable to promote a product with anything less than a “proper” keyboard. Ultimately, the world had to wait until the ZX Spectrum +2 released in 1986 for a Spectrum with such a keyboard, and that occurred only after the product line had been acquired by Amstrad. (One might also consider the ZX Spectrum+ in 1984, but its keyboard was more of a hybrid of the calculator keyboard that had been used before and the “full-travel” keyboards provided by its competitors.)

Some people claim that they owe nothing to the BBC Micro and everything to the ZX Spectrum (or, indeed, the computer they happened to own) for their careers in computing. Certainly, the BBC Micro was an expensive purchase for many people, although contrary to popular assertion it was not any more expensive than the Commodore 64 upon that computer’s introduction in the UK, and for those of us who wanted BBC compatibility at home on a more reasonable budget, the Acorn Electron was really the only other choice. But it would be as childish as the playground tribalism that had everyone insist that their computer was “the best” to insist that the BBC Micro had no influence on computer literacy in general, or on the expectations of what computer systems should provide. Many people who owned a ZX Spectrum freely admit that the BBC Micro coloured their experiences, some even subsequently seeking to buy one or one of its successors and to go on to build a successful software development career.

The Costly IBM PC

Some commentators seem to consider the BBC Micro as having been an unnecessary diversion from the widespread adoption of the IBM PC throughout British society. As was the case everywhere else, the de-facto “industry standard” of the PC architecture and DOS captured much of the business market and gradually invaded the education sector from the top down, although significantly better products existed both before and after its introduction. It is tempting with hindsight to believe that by placing an early bet on the success of the then-new PC architecture, business and education could have somehow benefited from standardising on the PC and DOS applications. And there has always been the persistent misguided belief amongst some people that schools should be training their pupils/students for a narrow version of “the world of work”, as opposed to educating them to be able to deal with all aspects of their lives once their school days are over.

What many people forget or fail to realise is that the early 1980s witnessed rapid technological improvement in microcomputing, that there were many different systems and platforms, some already successful and established (such as CP/M), and others arriving to disrupt ideas of what computing should be like (the Xerox Alto and Star having paved the way for the Apple Lisa and Macintosh, the Atari ST, and so on). It was not clear that the IBM PC would be successful at all: IBM had largely avoided embracing personal computing, and although the system was favourably reviewed and seen as having the potential for success, thanks to IBM’s extensive sales organisation, other giants of the mainframe and minicomputing era such as DEC and HP were pursuing their own personal computing strategies. Moreover, existing personal computers were becoming entrenched in certain markets, and early adopters were building a familiarity with those existing machines that was reflected in publications and materials available at the time.

Despite the technical advantages of the IBM PC over much of the competition at the beginning of the 1980s, it was also substantially more expensive than the mass-market products arriving in significant numbers, aimed at homes, schools and small businesses. With many people remaining intrigued but unconvinced by the need for a personal computer, it would have been impossible for a school to justify spending almost £2000 (probably around £8000 today) on something without proven educational value. Software would also need to be purchased, and the procurement of expensive and potentially non-localised products would have created even more controversy.

Ultimately, the Computer Literacy Project stimulated the production of a wide range of locally-produced products at relatively inexpensive prices, and while there may have been a few years of children learning BBC BASIC instead of one of the variants of BASIC for the IBM PC (before BASIC became a deprecated aspect of DOS-based computing), it is hard to argue that those children missed out on any valuable experience using DOS commands or specific DOS-based products, especially since DOS became a largely forgotten environment itself as technological progress introduced new paradigms and products, making “hard-wired”, product-specific experience obsolete.

The Good and the Bad

Not everything about the BBC Micro and its introduction can be considered unconditionally good. Choices needed to be made to deliver a product that could fulfil the desired specification within certain technological constraints. Some people like to criticise BBC BASIC as being “non-standard”, for example, which neglects the diversity of BASIC dialects that existed at the dawn of the 1980s. Typically, for such people “standard” equates to “Microsoft”, but back then Microsoft BASIC was a number of different things. Commodore famously paid a one-off licence fee to use Microsoft BASIC in its products, but the version for the Commodore 64 was regarded as lacking user-friendly support for graphics primitives and other interesting hardware features. Meanwhile, the MSX range of microcomputers featured Microsoft Extended BASIC which did provide convenient access to hardware features, although the MSX range of computers were not the success at the low end of the market that Microsoft had probably desired to complement its increasing influence at the higher end through the IBM PC. And it is informative in this regard to see just how many variants of Microsoft BASIC were produced, thanks to Microsoft’s widespread licensing of its software.

Nevertheless, the availability of one company’s products does not make a standard, particularly if interoperability between those products is limited. Neither BBC BASIC nor Microsoft BASIC can be regarded as anything other than de-facto standards in their own territories, and it is nonsensical to regard one as non-standard when the other has largely the same characteristics as a proprietary product in widespread use, even if it was licensed to others, as indeed both Microsoft BASIC and BBC BASIC were. Genuine attempts to standardise BASIC did indeed exist, notably BASICODE, which was used in the distribution of programs via public radio broadcasts. One suspects that people making casual remarks about standard and non-standard things remain unaware of such initiatives. Meanwhile, Acorn did deliver implementations of other standards-driven programming languages such as COMAL, Pascal, Logo, Lisp and Forth, largely adhering to any standards subject to the limitations of the hardware.

However, what undermined the BBC Micro and Acorn’s related initiatives over time was the control that they as a single vendor had over the platform and its technologies. At the time, a “winner takes all” mentality prevailed: Commodore under Jack Tramiel had declared a “price war” on other vendors and had caused difficulties for new and established manufacturers alike, with Atari eventually being sold to Tramiel (who had resigned from Commodore) by Warner Communications, but many companies disappeared or were absorbed by others before half of the decade had passed. Indeed, Acorn, who had released the Electron to compete with Sinclair Research at the lower end of the market, and who had been developing product lines to compete in the business sector, experienced financial difficulties and was ultimately taken over by Olivetti; Sinclair, meanwhile, experienced similar difficulties and was acquired by Amstrad. In such a climate, ideas of collaboration seemed far from everybody’s minds.

Since then, the protagonists of the era have been able to reflect on such matters, Acorn co-founder Hermann Hauser admitting that it may have been better to license Acorn’s Econet local area networking technology to interested competitors like Commodore. Although the sentiments might have something to do with revenues and influence – it was at Acorn that the ARM processor was developed, sowing the seeds of a successful licensing business today – the rest of us may well ask what might have happened had the market’s participants of the era cooperated on things like standards and interoperability, helping their customers to protect their investments in technology, and building a bigger “common” market for third-party products. What if they had competed on bringing technological improvements to market without demanding that people abandon their existing purchases (and cause confusion amongst their users) just because those people happened to already be using products from a different vendor? It is interesting to see the range of available BBC BASIC implementations and to consider a world where such building blocks could have been adopted by different manufacturers, providing a common, heterogeneous platform built on cooperation and standards, not the imposition of a single hardware or software platform.

But That Was Then

Back then, as Richard Stallman points out, proprietary software was the norm. It would have been even more interesting had the operating systems and the available software for microcomputers been Free Software, but that may have been asking too much at the time. And although computer designs were often shared and published, a tendency to prevent copying of commercial computer designs prevailed, with Acorn and Sinclair both employing proprietary integrated circuits mostly to reduce complexity and increase performance, but partly to obfuscate their hardware designs, too. Thus, it may have been too much to expect something like the BBC Micro to have been open hardware to any degree “back in the day”, although circuit diagrams were published in publicly-available technical documentation.

But we have different expectations now. We expect software to be freely available for inspection, modification and redistribution, knowing that this empowers the end-users and reassures them that the software does what they want it to do, and that they remain in control of their computing environment. Increasingly, we also expect hardware to exhibit the same characteristics, perhaps only accepting that some components are particularly difficult to manufacture and that there are physical and economic restrictions on how readily we may practise the modification and redistribution of a particular device. Crucially, we demand control over the software and hardware we use, and we reject attempts to prevent us from exercising that control.

The big lesson to be learned from the early 1980s, to be considered now in the mid-2010s, is not how to avoid upsetting a popular (but ultimately doomed) participant in the computing industry, as some commentators might have everybody believe. It is to avoid developing proprietary solutions that favour specific organisations and that, despite the general benefits of increased access to technology, ultimately disempower the end-user. And in this era of readily available Free Software and open hardware platforms, the lesson to be learned is to strengthen such existing platforms and to work with them, letting those products and solutions participate and interoperate with the newly-introduced initiative in question.

The BBC Micro was a product of its time and its development was very necessary to fill an educational need. Contrary to the laziest of reports, the Micro Bit plays a different role as an accessory rather than as a complete home computer, at least if we may interpret the apparent intentions of its creators. But as a product of this era, our expectations for the Micro Bit are greater: we expect transparency and interoperability, the ability to make our own (just as one can with the Arduino, as long as one does not seek to call it an Arduino without asking permission from the trademark owner), and the ability to control exactly how it works. Whether there is a need to develop a completely new hardware solution remains an unanswered question, but we may assert that should it be necessary, such a solution should be made available as open hardware to the maximum possible extent. And of course, the software controlling it should be Free Software.

As we edge gradually closer to September and the big deployment, it will be interesting to assess how the device and the associated initiative measures up to our expectations. Let us hope that the right lessons from the days of the BBC Micro have indeed been learned!

Friday, 20 March 2015

EvQueue, free software job scheduler and queueing engine, liberated!

Nicolas Jean's FSFE blog » English | 10:41, Friday, 20 March 2015

EvQueue is a free software task scheduler and queueing engine. It handles the planning of simple tasks but also that of workflows, chaining basic pieces of code to more complex endeavours.

We’ve been working on it in my NGO* for around three years, and been thinking about liberating it for half of that time. Now it’s finally available for everybody to enjoy, although it remains more of a web/server/admin/dev thing. We’ve been using it in a production environment since the beginning; to date, more than four million workflows have been executed.

It has proven very useful for our websites and web applications to allow for background tasks. When a user wants, for example, to generate personalised snail mail for thousands of people (with the accordingly big PDF file), or to upload a bunch of photos that need to go through some processing, the operation can get lengthy. Far too lengthy (tens of seconds, even minutes) for a page reload! Launch an evQueue workflow that deals with the export or upload, and it’ll run in the background. The user can go on with his or her navigation, and the website can silently check on evQueue to know when the workflow is finished, then inform the user in whichever way.

Documentation on how to install and use evQueue, as well as workflow examples, is available on the evQueue website.

* Que Choisir is a French consumer-protection organisation, we at the IT department work on its websites and many internal web applications.


Unexpected turn at panel discussion on software patents and Free Software

I love it here » English | 10:00, Friday, 20 March 2015

On Monday 17 March 2015, I participated in a panel discussion organised by the European Patent Office at the Cebit in Hannover. The title of the discussion was “Patents, Standards, and Open Source — a changing landscape”. I prepared to discuss software patents, but something unexpected happened in the panel discussion.

I was invited by Grant Philpott (Principal Director of the ICT area in the European Patent Office) to participate in the panel discussion. Besides him as moderator, the participants were: Brian Hinman (Senior Vice President and Chief IP Officer, Royal Philips), Koen Lievens (Director DG1, European Patent Office), and myself.

To prepare, I first read the EPO’s position on software patents again, and then prepared for the discussion together with our current interns Marius Jammes, Miks Upenieks, and Nicola Feltrin. So they had to read some articles — including one of my favourites, “The Most Important Software Innovations” by David Wheeler — and we discussed the main arguments for and against software patents again. That was good practice for them, as well as for me. After this we were well prepared to discuss details about software patents.

Before the event, Brian Hinman and myself were asked to prepare a short input statement about the “main IP needs of the ICT sector in the future, how you see these being ideally met, and what will need to change in order to get to that ‘ideal’ situation.” (My notes for this statement are below.) This was the start of the panel discussion.

I was astonished by what happened when the audience was included in the discussion: almost all their questions were about Free Software, and almost none about patents. Instead of the expected comments like “but how do we give incentives to inventors” or “but we have to secure investments”, people were interested in Free Software specifics. Of the 45 minutes on the panel, we spent at least 25 minutes speaking exclusively about Free Software business models, compliance issues, copyright management, and why Free Software is important for our society and the economy. Afterwards I spent over an hour answering several questions from the audience which we could not cover during the discussion.

So this discussion took a completely unexpected turn for me. But in this case I was very happy about that.

My introduction statement

Today Free Software runs on the majority of computers around the world: from supercomputers and other servers, to robots or space shuttles, to computers we carry around every day like phones or tablets, to very small computers we often do not recognise as such.

How did we reach the point where nowadays the most important operating system is Free Software, every company uses Free Software, and it is almost impossible to develop other software without using Free Software yourself?

We achieved that because Free Software empowers people rather than restricting them. Based on copyright we use licenses which grant everybody the rights to use, study, share and improve software for any purpose.

  • The right to use it for any purpose guarantees that everybody can participate in using and developing software. So there is no discrimination regarding who can use the technology or what you can use it for.
  • Every Free Software license grants you the right to study how it works: in a world as complex as ours, we cannot afford to keep things secret if we want to solve problems. Source code plus documentation is the best way to share the knowledge of how IT devices work. Publishing source code is also the best way to enable interoperability and therefore competition.
  • To adapt software to your own needs, it is crucial that you are allowed to improve it. Technology should do what you want to do with it, not what others thought it should do. So you are allowed to modify all parts of the software, use only parts of it, experiment with it, and combine programs to create new products.
  • Furthermore, you have the right to share knowledge and workload with others. We have many problems in the world which can be solved with software, but few people who can actually solve them in a good way. Let us enable them to concentrate on fixing new problems, instead of fixing ones which were already solved. So Free Software always allows you to share the software — modified or not — with others.

We guarantee everybody those rights through copyright.


  • Legal issues: too many legal issues around technology. Let people be creative to fix other people’s problems, instead of focusing on problems resulting e.g. out of copyright and patents.
  • Licenses: most FS licenses are much easier to understand than proprietary software licenses. Solution: but still we can make them easier to understand and work with, and have fewer licenses.
  • Patents: problematic to have additional monopolies on principles instead of implementations. Burden to do research what other people already did in a field, the need to negotiate with them, dealing with lawsuits. So stronger clarification that patents on software are not allowed. In case it is not clear if it is software or hardware, patents should not be granted.
  • Secrecy: not publishing the source code, thereby preventing others in society from understanding how products work or from making interoperable products. This restriction also continues after the copyright period. Solution: at least publicly financed software (including research) needs to be published under Free Software licenses. This way the results can be integrated in all kinds of products. Maybe a requirement to deposit source code.
  • Restricting hardware platforms: someone else controls what you can install on your computers. Solution: clear right that you are allowed to change software on your computers, and as a company also sell those afterwards.

Thursday, 19 March 2015

GNU/Linux, an iPod and Clementine: becoming friends

pb's blog » en | 23:04, Thursday, 19 March 2015

Yet another friend of mine came to me with the problem that they were stuck with the music collection they currently had on their iPod: they had had to re-install the OS on their computer, and were now afraid to connect the iPod to iTunes, since it might “synchronize” their not-on-the-harddrive-anymore collection and thereby wipe all the music from the iPod.


Since that wasn’t the first time I’ve heard this, I offered that I “might take a look at it” – and try to hook it up with my all-time-favorite music player and operating system: Clementine on GNU/Linux (Xubuntu 12.04 LTS).

I was expecting the iPod to be accessible somehow on GNU/Linux, and expected to have to read some HowTos, etc – but I was amazed: It just worked! ™
I connected the iPod over USB, and it immediately showed up as mass-storage device (it’s labeled “PAN”):

I double-clicked it, and was served with the files on the device. The music collection is stored in a non-human-readable way: the folders are labeled “FXX” (XX being a zero-padded number), and the music files have 4-letter names with some hex IDs.

Hm… Let’s hope the files are tagged properly, and that Clementine can handle them.
So, I started Clementine and selected the option “Devices”. There it was!

Not only could I access (and therefore backup) all files on the iPod, but Clementine can read and even upload songs to it without problems – or any configuration necessary. Just select the tracks you want to upload to the iPod, right-click and select “Copy to device…”.
That actually makes way more sense for handling your music collection on your portable music player.

So not only was I able to easily make a full backup of the data on the iPod, but I could easily manage handling the music collection, with the full convenience of Clementine for browsing, tagging and uploading the files.

That’s yet another situation where Free Software enables users to avoid the unnecessary lock-in by the original vendor.
I’m not suggesting that anyone get an iPod, but some people already have one – and it’s better to enable them to use it with Free Software than to have them throw it away.

It might be one of the few MP3 players that survive more than a few months, but it’s just too “restrictive/defective-by-design” for my taste. My all-time favorite portable audio player is the “Sansa Clip+” running Rockbox :)
Once you’ve gone Rockbox, you just don’t want to go back (especially as an audio engineer)!

Friday, 17 April 2015

Libreoffice Design Session: CMIS Improvement

bb's blog | 07:31, Friday, 17 April 2015

The Libreoffice UX team presents a proposal for an improved integration of content management interoperability services (CMIS). It was the outcome of the second ‘design session’ that will be conducted regularly. Topic of last week’s Libreoffice design session was the integration of content management interoperability services (CMIS). Here is the outcome of this meeting. [...]

Tuesday, 17 March 2015

Mobile is the Future | 01:42, Tuesday, 17 March 2015

Photo of a smashed mobile phone.

(photo Cory Doctorow, CC-BY-SA)

A few days ago I got an email from Google Webmaster Tools which said, no more and no less: ‘Your webpage sucks on mobile devices!’ Well, all right, I’m paraphrasing, but that was the gist of it.

I never really paid that much attention to how my site looks on phones or tablets. I’ve made sure it loads and looks OK, but apart from that I never spent much time on the issue. I always thought optimising for a small screen would be a lengthy process. How mistaken I was!

In my defence, when I last looked at the problem, the state of mobile browsers was different, but now there are really just two things to do. First of all, add a viewport meta tag, e.g.:

<meta name=viewport
      content="width=device-width, initial-scale=1">

and then use min-width or max-width CSS media queries. Admittedly the second part may take some time, but if your layout uses simple markup rather than being TABLE-based, reading the excellent article on A List Apart might turn out to be the most time-consuming step.
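To make the two steps concrete, here is a minimal self-contained page combining the viewport tag with a media query. The `.content`/`.sidebar` class names and the 40em breakpoint are arbitrary examples for illustration, not anything this site actually uses:

```html
<!DOCTYPE html>
<html>
  <head>
    <meta name="viewport"
          content="width=device-width, initial-scale=1">
    <style>
      /* Default: a single column, sidebar below the content (narrow screens) */
      .content, .sidebar { width: auto; float: none; }

      /* Wide screens only: place the sidebar next to the content.
         40em is an arbitrary example breakpoint. */
      @media (min-width: 40em) {
        .content { float: left;  width: 70%; }
        .sidebar { float: right; width: 28%; }
      }
    </style>
  </head>
  <body>
    <div class="content">Main content…</div>
    <div class="sidebar">On small screens this ends up below the content.</div>
  </body>
</html>
```

Resize the browser window across the breakpoint to see the sidebar drop below the content without any JavaScript involved.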

So if you haven’t already, do take a look at whether your website looks reasonably good on small screens. Apparently mobile is the future, or some such.

The ‘bad’ news is that I’ve dropped the endless scroll feature. This is because in the narrow layout the sidebar moves to the bottom, and having endless scrolling enabled would make it unreachable, since it would run away all the time.

Saturday, 14 March 2015

Open Letter to Mr. Cook (Apple Computers)

FSFE Fellowship Vienna » English | 15:24, Saturday, 14 March 2015

Mr. Cook,

I just watched your newest keynote. The best thing you presented was the use of USB-C on new MacBooks. Finally you have decided to use an open standard which can make life easier for all users.

Unfortunately, that was the end of the good news, since everything else seems to go in the opposite direction. Worst of all is launching a massive surveillance device like the iWatch without being completely transparent about what happens inside the device and on your servers. I, as a user, want to be in control of my data. But this is a concept you obviously oppose.

As long as you stick to closed source software, DRM, restrictive licences and patent laws to maximise your profits, you heavily contribute to inequality and powerlessness around the globe.

This makes you an absolute no-go as a source of computing devices.

Please don’t only consider your gain in power and profit, but also the effect of your work on our society. Do you really consider disempowerment an ethical contribution that you want to be a part of?

Friday, 17 April 2015

Libreoffice Design Session: Entries at Indexes and Tables

bb's blog | 07:31, Friday, 17 April 2015

The Libreoffice UX team presents a proposal for an improved dialog to tweak the entries of a table of contents. It is the outcome of the first of the ‘design sessions’ that will be conducted regularly. The Libreoffice UX team started last week with another type of meeting, the design session. The goal is to discuss one issue [...]

Discuss the Future of Activities and Virtual Desktops

bb's blog | 07:31, Friday, 17 April 2015

We ask for your feedback on a future scenario of what Activities and Virtual Desktops could evolve into. It has been quite some time since we asked you to share your experiences with Virtual Desktops and Activities. Meanwhile we have been thinking through the enormous amount of feedback you provided. It was very inspiring. Thanks [...]
