Planet Fellowship (en)
Thursday, 05 December 2013
Being Fellow #952 of FSFE » English | 14:19, Thursday, 05 December 2013
The third trip of our scheduled tour around Frankfurt went to Aschaffenburg (I just added the location to OpenStreetMap). Although one thought behind these excursions was to keep the commute of the Fellows short, only two of the six Fellows actually came from Aschaffenburg or nearby. We had two newcomers, one of them from Frankfurt.
After a short introduction of those who didn’t know each other, we chatted about the particularities of Aschaffenburg, public transportation and the Liberario App, gadgets, the history of Palm and Nokia and heard anecdotes from various Chaos Communication Congresses.
On FSFE-related matters, we talked about our recent achievements at the Cryptoparties in Frankfurt and forged further plans for the next one. I pointed to the latest newsletter with the timeline of FSFE’s achievements, which is a nice resource to refer to when people ask: what is FSFE actually doing?
We talked a little bit about Jacob Appelbaum as the most recent famous Fellow and his quote for the Fellowship page. I forgot to mention that we are still looking for more quotes for the website, but will do so in this post and on the mailing list.
We talked about possible activities for Document Freedom Day 2014 and whether or not it might be combined with the TheyDontWantYou.to campaign. We concluded that this is probably not the best idea, but we may hand out stickers to youths and kids if they happen to show up at the event, which still has to be decided on.
Sven brought up the idea of using a shared Twitter account to announce and promote our local activities to a broader audience. I think this may work out very well. But what I personally would really like is an easy way to push our messages to various social networks automatically. Friendica may be an option here, but I don’t want to neglect my other commitments by looking any further into that. Any suggestions that may save me the time?
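Absent a ready-made tool, a small script could at least do the basics: read the group’s blog feed and turn the newest entry into a status line short enough for any of the networks. This is only a sketch under assumptions — the feed layout, the example URL and the 140-character limit are invented for illustration, and the feed is inlined here rather than fetched:

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS feed content, inlined for illustration; a real
# script would fetch the group's actual feed with urllib.
RSS_SAMPLE = """<rss version="2.0"><channel>
<title>Fellowship Group Rhein/Main</title>
<item><title>Fellowship meeting in Aschaffenburg</title>
<link>https://example.org/aschaffenburg</link></item>
</channel></rss>"""

def latest_status(rss_text, limit=140):
    """Build a status message for the newest feed item, trimmed to fit."""
    channel = ET.fromstring(rss_text).find("channel")
    item = channel.find("item")  # RSS feeds list newest items first
    title = item.findtext("title")
    link = item.findtext("link")
    return f"{title} {link}"[:limit]

print(latest_status(RSS_SAMPLE))
```

The same message could then be handed to whatever posting mechanism each network offers; the parsing step stays identical either way.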
As it was a long evening, there was most likely more stuff worth mentioning, but I will stop here. It may be added to the minutes in the wiki.
Thinking out loud | 13:15, Thursday, 05 December 2013
Since I left the Éclés (French scouts) to exile myself in Berlin, I have missed teenagers. Adolescence is a crazy time when everything is intense: love, despair, enthusiasm, boredom and… the propensity for geekery.
One of the largest high schools in France, the lycée Doisneau in Corbeil-Essonnes, organizes monthly open sessions on various subjects. We have the opportunity to devote the February session to the internet: an hour and a half with several classes to talk about collaboration, the commons, knowledge sharing and freedom. The official curriculum only provides for awareness campaigns about copyright and the dangers of the internet.
So I now need to find a speaker to co-host the session. The “objectives” I have in mind are:
- make them dream, and break with both the fear-mongering discourse and the marketing of the SaaS companies, Facebook, Google and their little cousins.
- shatter the geek cliché and make it clear that everyone has a place in the communities built around technology
- approach technology and its uses as political issues
For once, I am not planning to centre my talk on questions of surveillance and/or censorship, but rather on the incredible collaboration enabled by the internet and by copying at will. Examples: Wikipedia, free software, various Arduino projects… I intend to show concrete, motivating achievements and then explain why they are only possible thanks to an open and neutral network.
What do you think?
Over the next few days I will contact several people I can easily imagine in this kind of discussion. Time to get to work building a session the teenagers won’t forget any time soon!
anna.morris's blog | 11:59, Thursday, 05 December 2013
I don’t like adverts because they distract me, they try to manipulate me, and they are usually ugly or annoying. But this article got me thinking about ads online again. It focuses mostly on Adblock Plus (GPL), which now has a “white list” system. Ads let through the block must first meet certain criteria (no animation, not in the way of the rest of the site, not too big, etc.), and secondly any large companies wishing to be unblocked must pay a fee.
I have been concerned about this tactic since looking at Ghostery’s (proprietary) white-list, where it is not clear what it is or why folks are on it. Adblock Plus are at least transparent about what they do – so I can learn and blog and discuss with you guys! Also, if we compare their white-list scheme to things like “vegan society approval”, “BUAV approved” or an “ethical company mark” – it’s probably pretty similar. In fact, if I could simply “block” the unethical products – those tested on animals, those with meat in, those made in sweatshops or by firms who shoot union officials – from ever being in front of my face, I probably would.
Anyway, the interesting bit for me is near the end of the article, where it says that the rough cost of an “ad-free internet” would be £44 per user each month, on top of whatever they pay now. That covers all the news sites, blogs, social media sites etc. we visit – all sites that don’t make money in other ways (so web shops are not included).
I would like to look at how they came to this figure exactly, but for now, I think I disagree with it. The first point is that there are many, many projects – video and blog especially – which make a profit from ads because they can, not because they need to. And hello, I know how long it takes to make a video… but I am pretty sure 99% of these people would still make the video if they knew for a fact they would never see a dime. It just happens that when they start to get some serious views (and not before!) they think – hey, I could make a bit of cash by putting an ad in front of this. So actually, they are not the people who need the money for their site. The people who do are YouTube, who take a cut too – hosting video takes a lot of server power. So does that make up part of the cost? Did they factor in need, or simply the uptake of ad revenue?
The situation is similar with blogs. There are so many blogs (and forums) which are “monetised” – but I really doubt this just covers the hosting, or that these people are working full time and that the ads are their income… and even if they were… how can this £44-a-month argument hold up in the face of projects like Wikipedia? Given for free, paid for by donations, paid for with time and love? On top of which, we know where our money is going with Wikipedia: they tell us, they keep accounts.
I am actually quite uncomfortable with the “transaction” involved with online ads on small sites. It goes something like: “come to my site, click on an ad, I get some money, you know nothing about me, it may be funding my porn addiction/coke habit/mad religion… but you will never find out anyway.” This is in some ways even worse than the high-street / ethical certification situation I mentioned earlier. At least we can, to some extent, know how good/bad/ugly a shop or product is. We can make an informed decision about where our money goes. We can’t really know this with privately run internet sites… which isn’t a problem… till we are giving them money. And herein lies an issue: by clicking on an ad, we are making a financial transaction – a small one, an invisible one, sure. But there is a figure that can be put on it.
That changes the way I think about ads. I guess it’s about more than just me versus manipulation and distraction.
Wednesday, 04 December 2013
Thinking out loud | 17:53, Wednesday, 04 December 2013
A week after the event, I finally found time to write about our international Document Freedom Day meeting. Last week the DFD core team met in Berlin to discuss next year’s campaign. We have a bit less than four months left, and that’s not much.
We discussed the new aspects of this year’s campaign: the new subcampaign about Open Standards in education, merchandising, advocacy material, new content on the new website (!), our strategy to grow DFD outside of Europe and so on.
- we each chose areas we are responsible for. Contact me if you have questions or suggestions on how to grow DFD in your country, on how to reach more non-technical computer users (ideas to improve the explanations on documentfreedom.org are especially welcome), or if you can help with fundraising (together with Sam).
- we had the pleasure of having Nermin Canik from Turkey with us. She explained how she coordinated numerous volunteers and reached the press, which is why five great new events happened in her country last year. Thanks again for coming, Nermin – your work will inspire ours!
♬ For the future of collaboration, information accessibility and long term archiving, choose open standards ♬
Tuesday, 03 December 2013
Monday, 02 December 2013
Inductive Bias | 20:26, Monday, 02 December 2013
I’ve had a bit of time during the past two weeks – so I migrated the content of http://blog.drost-fromm.de over to a new backend. Before fully switching over some time in the next two weeks I’d greatly appreciate your input on missing functionality, broken encodings or general weirdness as a result of the switch. Check out the new version at its temporary location over at http://www.drost-fromm.de/idfblog (including any new posts that appear until the switch is complete).
Known issues: The tag cloud went away, so did comments (switched off here due to spamming reasons anyway), and the archives moved to a sub-page. Oh, and I definitely need to tidy up the tags and categories assigned to posts. But that is a known issue for the current blog as well, unfortunately.
Paul Boddie's Free Software-related blog » English | 18:55, Monday, 02 December 2013
Back when I last wrote about the status of the Neo900 initiative, the fundraising had just begun and the target was a relatively modest €25000 by “crowdfunding” standards. That target was soon reached, but it was only ever the initial target: the sum of money required to prototype the device and to demonstrate that the device really could be made and could eventually be sold to interested customers. Thus, to communicate the further objectives of the project, the Neo900 site updated their funding status bar to show further funding objectives that go beyond mere demonstrations of feasibility and that also cover different levels of production.
So what happened here? Well, one of the slightly confusing things was that even though people were donating towards the project’s goals, it was not really possible to consider all of them as potential customers, so if 200 people had donated something (anything from, say, €10 right up to €5000), one could not really rely on them all coming back later to buy a finished device. People committing €100 or more might be considered likely purchasers, especially since donations of that size are effectively treated as pledges to buy and qualify for a rebate on a finished device, but people donating less might just be doing so to support the project. Indeed, people donating €100 or more might also only be doing so to support the project, but it is probably reasonable to expect that the more people have given, the more likely they are to want to buy something in the end. And, of course, if someone donates the entire likely cost of a device, a purchase has effectively been made already.
So even though the initiative was able to gauge a certain level of interest, it was not able to do so precisely purely by considering the amount of financial support it had been receiving. Consequently, by measuring donations of €100 or more, a more realistic impression of the scale of eventual production could be obtained. As most people are aware, producing things in sufficient quantity may be the only way that a product can get made: setup costs, minimum orders of components, and other factors mean that small runs of production are prohibitively expensive. With 200 effective pledges to buy, the initiative can move beyond the prototyping phase and at least consider the production phase – when they are ready, of course – without worrying too much that there will be a lack of customers.
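The pledge-counting logic described above amounts to a simple threshold filter. The donation amounts below are invented purely for illustration; only the €100 cut-off comes from the project itself:

```python
# Hypothetical donation amounts in EUR -- the real figures are not
# public; these are invented to illustrate the 100-euro threshold.
donations = [10, 25, 100, 5000, 50, 250, 100, 980, 15, 300]

PLEDGE_THRESHOLD = 100  # donations at or above this count as pledges to buy

# Split donors into likely purchasers and general supporters.
pledges = [d for d in donations if d >= PLEDGE_THRESHOLD]
supporters = [d for d in donations if d < PLEDGE_THRESHOLD]

print(f"{len(pledges)} likely purchasers, {len(supporters)} general supporters")
print(f"total raised: {sum(donations)} EUR")
```

Counting pledges rather than euros is what lets the initiative estimate the size of a production run, since it is the number of buyers, not the amount donated, that has to clear the minimum-order quantities.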
Since my last report, media coverage has extended into the technology mainstream, with Wired even doing a news article about it. Meanwhile, the project itself demonstrated mechanically compatible hardware and the modem hardware they intend to use, also summarising component availability and potential problems with the sourcing of certain components. For the most part, things are looking good indeed, with perhaps the only cloud on the horizon being a component with a 1000-unit minimum order quantity. That is why the project will not be stopping with 200 potential customers: the more people that jump on board, the greater the chances that everyone will be able to get a better configuration for the device.
If this were a mainstream “crowdfunding” effort, they might call that a “stretch goal”, but it is really a consequence of the way manufacturing is done these days, giving us economies of scale on the one hand, but raising the threshold for new entrants and independent efforts on the other. Perhaps we will eventually see innovations in small-scale manufacturing, not just in the widely-hyped 3D printing field, but for everything from electronic circuits to screens and cases, that may help eliminate some of the huge fixed costs and make it possible to design and make complicated devices relatively cheaply.
It will certainly be interesting to see how many more people choose to extend the lifespan of their N900 by signing up, or how many embrace the kind of smartphone that the “fickle market” supposedly does not want any more. Maybe as more people join in, more will be encouraged to join in as well, and so some kind of snowball effect might occur. Certainly, with the transparency shown in the project so far, people will at least be able to make an informed decision about whether they join in or not. And hopefully, we will eventually see some satisfied customers with open hardware running Free Software, good to go for another few years, emphasizing once again that the combination is an essential ingredient in a sustainable technological society.
Don't Panic » english planet | 10:32, Monday, 02 December 2013Ten days ago, on September 27 to 29, FSFE held its first European Coordinators Meeting (#ECM) in Berlin. Therefore, the FSFE invited Fellowship Coordinators from all over Europe to come for a weekend to exchange knowledge and visions between FSFE staff and the Fellowship. In total, we have been 22 people from 10 different countries Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013Today is the day to show your love for Free Software. Here is my message: I love Free Software because it is in the very heart of a 21st century society that respects the rules of privacy, autonomy, democracy, participation and the freedom of speech. (Just to list a few of its countless good characteristics) Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013Today, the new posters for Document Freedom Day 2013 arrived. Thank you, Stepan @stehno Stehlicek – FSFE’s new intern – for your wonderful presentation:
Don't Panic » english planet | 10:32, Monday, 02 December 2013… but they don’t want you to! This is the name of a new campaign by the FSFE that will be launched soon. To not miss it, follow FSFE’s news section. In the meantime, keep an eye on the streets – you will maybe see some stickers where you don’t expect them.
Don't Panic » english planet | 10:32, Monday, 02 December 2013Last weekend, the German Pirate Party held its federal party convent to discuss and potentially agree on various amendments to their manifesto. Among them, there was amendment number 551: “Freie Softwareinstallation statt App-Store-Zwang” (German). A proposal, that aims at giving every user the right to install whatever software he likes on any computer-like device – Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013Last week, together with Torsten @grote Grote, I attended SFScon to give a FreeYourAndroid workshop. Thanks to Patrick @ohnewein Ohnewein and Shaun @shaunschutte Schutte, we had the chance to use two Nexus 7 for every purpose. We decided to install Cyanogenmod on one and Ubuntu on the other. Installing Ubuntu was my turn and I Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013Last weekend I was attending FSCONS, where I was giving a FreeYourAndroid-workshop as well as my first talk about FSFE’s FreeYourAndroid campaign. I have never been to FSCONS before and I was quite surprised by its familiar atmosphere. Some special fun came up with the Karaoke event at Saturday night, where in the end nearly Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013If you know the Free Software Foundation Europe and you like what we are doing, there are a lot of ways and possibilities, how to support our work. You might join a local Fellowship group, subsribe to the newsletter, become a Fellow, [...] or – newly – become a supporter. What does that mean – Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013About digital restrictions Today, May 3rd 2013, is the international day against Digital Restrictions Management, powered by the Free Software Foundation. Usually, the term Digital Restrictions Management (DRM) refers to various restrictions that companies – or any other content provider – impose on digital media and data. These restrictions are there to let providers decide Read more »
Don't Panic » english planet | 10:32, Monday, 02 December 2013Recently, I made a blogpost about the ownership of your own device and how control of technology is directly linked with the freedom of society – as well as with the freedom of each individual. The argument made in that post was, that remote control of technology in the hands of manufacturers put users out Read more »
Saturday, 30 November 2013
Colors of Noise - Entries tagged planetfsfe | 12:54, Saturday, 30 November 2013
Following up on my port of the crystalhd plugin to the gstreamer 1.0 api I realized that the CrystalHD repo is pretty dormant. After reading slomo's nice article about GStreamer and hardware integration and a short off list mail exchange I decided to split the GStreamer part out of the CrystalHD repo and to try to get the plugin into gst-plugins-bad.
Since the kernel part is already in the linux kernel’s staging area there would not be much left in the repo except for the libcrystalhd library itself and the firmware blobs. So I split them out as well and started to clean them up a bit by moving them to autoconf/automake, dropping the need for a C++ compiler, and adding symbol versioning, among other things.
So far, video is still smooth with:
gst-launch-1.0 filesrc location=sample.mp4 ! decodebin ! xvimagesink
after jhbuilding up to gst-plugins-bad.
There are #ifdefs for macOS and Windows, but I doubt they’re functional. In case anybody is building libcrystalhd on these platforms, it would be great to know whether it still works.
Should these efforts lead to the crystalhd plugin being merged into GStreamer, getting the kernel driver out of staging would be a great next step.
Thursday, 28 November 2013
Seravo | 13:38, Thursday, 28 November 2013
The first device running SailfishOS, the successor of MeeGo, has finally been released. It’s elegant and beautiful both on the outside and inside. It has multiple unique features that make it unlike any mobile device we’ve seen so far.
We have been waiting for Jolla to release their phone for more than a year, and finally it has happened. It is certainly not an easy task to make the world’s greatest mobile device and fulfil all the expectations people have for Jolla, but they have indeed succeeded in doing something amazing. The last Nokia MeeGo device, the N9, was very good and praised for its gesture-based interface; now, two years later, all the new Google apps, Windows Metro and Ubuntu phone, among others, are built around swiping. In Jolla’s SailfishOS the gesture-based interface is refined and feels almost magical to use.
The device rocks a clean and elegant Nordic design. It is simply beautiful. There are no front-facing buttons, but on the side there is a power key and volume buttons. The volume buttons double as camera buttons when the camera is open. The back-facing camera has an LED flash, auto-focus and an 8-megapixel sensor, which is more than enough. In fact, the default wide-screen camera mode takes 6.1-megapixel pictures, which many agree is an optimal file-size-to-quality ratio. There is also a smaller front-facing camera for video calls. The internal storage is 16 GB and there is a slot for an external microSD card, where users can attach e.g. a 64 GB card. The screen resolution is 960×540 pixels, which looks very sharp on the beautiful 4.5-inch screen – no pixels can be distinguished with the bare eye. All the usual sensors are included (compass, gyroscope, acceleration, ambient light). The GPS is a dual chip with GLONASS support, so it will be able to figure out the location quickly – even inside vehicles or indoors. The battery is user-replaceable and has a capacity of 2100 mAh.
All of the above can be found in other top-of-the-line smartphones as well. However, there is one hardware feature that is unique to Jolla: The Other Half, a user-changeable smart back cover. The basic covers included in the package feature an embedded NFC chip that makes the Sailfish UI change colour and theme (called Ambience). However, an Other Half could also connect to the main device using an I2C connector. I2C is a bus standard common in all sorts of electronic devices, and the connector can carry both power and data. Using this bus, anybody could make all kinds of imaginative Other Halves. Hopefully a keyboard will be one of the first Other Halves to come to the market. The battery connectors also face the back cover, so it should be easy for any manufacturer to make a giant 10,000 mAh battery-filled Other Half. And of course an Other Half could also use Bluetooth or other generic means to communicate with the main device. The official specs will soon be released, so even home 3D-printing enthusiasts may produce their own Other Halves. It will be very exciting to see what kinds of Other Halves start to appear in the future.
The software: SailfishOS
The software is indeed something altogether unique. Most readers of this blog are familiar with the story of Maemo-Meego-Eflop. Now the big question is: does SailfishOS offer something its competition does not? – Yes! The swipe-based UI is a bit weird for the first 5 minutes you use it, but once your muscle memory catches up, you’ll notice your fingers swiping all the devices you touch and your brain wondering why those other devices require unnecessary amounts of thought to use. Android is easy to use, but even after using SailfishOS for just one day, going back to the Android world of multiple desktops, widgets, app menus and such starts to look rather complex. SailfishOS is just so natural you need to experience it yourself.
Visualise this: you take your device out of your pocket and then double-tap on the screen with your thumb to activate it. Then, without moving your thumb from where it is, you just swipe down a little bit and feel the device vibrate three times as the selection passes over the pull-down menu options. Even without looking at the device you know when the third option (camera) is selected. Then you simply lift your thumb and the camera opens. You point at something, move your thumb slightly down and then tap to take the picture. Want to look at the picture you just took? Just swipe left and the picture is visible in full screen. Pinch to zoom. When you are done and want to look at, e.g., the clock, just swipe left starting from the edge of the screen and you get the main view with the time, battery status, open-apps overview etc. Maybe in the middle of that you decide you still have time to capture some more photos: instead of swiping all the way to the end, you reverse direction in the middle and return to the camera. All this with very smooth animations and a responsive feel.
On a day-to-day basis, a very important part of the user experience is the on-screen keyboard. SailfishOS uses the same Maliit keyboard familiar to many from the N9, which is excellent. It is strange that this open-source keyboard component hasn’t been picked up by other mobile Linux “distributions”.
Jolla themselves talk a lot about the UI theme system called Ambience. Basically you pick a background picture, and Ambience automatically applies the picture and matching colours all over the user interface. Most of the time it is stylish, but sometimes the system thinks bright red is a good colour for UI elements and you wish you could set the colours manually; but you can’t.
Multitasking is a feature that sounds technical, but when you have the app overview open and see the real-time miniature versions of the apps (think GNOME Shell), the feature and its advantages are easy to understand. It was great to be able to have Osmand download a 150 MB offline map file uninterrupted while browsing some photos at the same time. It was great to be able to load up a music video on YouTube and listen to it without having the screen turned on at all, or while reading an e-mail in another app.
The settings center is fantastic. In fact, the whole way the app settings, contacts, accounts and everything in general are integrated is very good; but then again that was already true of the Nokia N900. This is an area where Maemo-MeeGo was a pioneer and the competition hasn’t caught up yet. Linux geeks will love that the settings include an option to enable developer mode. With developer mode enabled, the Terminal app becomes visible. It is in fact the famous FingerTerm app, with a special four-row keyboard with all the special keys needed in terminal use. The keyboard sits translucent on top of the content, so the maximum screen area is available for terminal output, and the app works well in both landscape and portrait mode. Linux geeks and developers will also respect the fact that SailfishOS is a true GNU/Linux system running Linux kernel 3.4 with a fully functional shell, that software is managed as RPM packages with Zypper, and that the SailfishOS project is very open to new contributors. If you want to read about how this software is shipped, check out the SailfishOS site and the upstream projects Mer and Nemo.
The SailfishOS label says ‘beta’ but the system itself seems mature and stable, at least in our use so far. All the functionality that belongs to a modern mobile OS is there, but when it comes to things apps should do, there are still many things lacking.
The software: HTML5, Jolla and Android apps
Jolla has its own app store where you can browse and install native Qt/QML-based apps. At the moment however, there are very few apps, but then again all the basic apps you need like an alarm clock, e-mail, maps, calendar etc. are included.
At Seravo we hold the belief that in the long term, browser-based HTML5 apps will be more important than native apps. Therefore, to us the mobile browser is more important than all of the native apps combined. It should be easy to search the web, enter URLs, open new tabs, save bookmarks etc. All of this can be done with the current browser in SailfishOS. The browser seems to work fast and flawlessly. We heard at the launch event that the rendering engine is Gecko (same as in Firefox), so it is likely to have good support for HTML5 features. However, in terms of usability, Chrome for Android is still the best mobile browser we’ve used so far. In particular the SailfishOS browser does not seem to have support for landscape mode. Hopefully, as SailfishOS matures, the browser will grow more polished as well.
Android apps can be run using Alien Dalvik (probably some sort of virtual machine layer). You can either get both free and paid apps from the bundled Yandex store, or you can get individual .apk files from other sources and install them manually. If you prefer open source only, then one option would be to use F-Droid, which can be installed by simply opening f-droid.org in the browser, downloading the .apk and activating it.
Interesting times ahead
The SailfishOS website states: “We believe this will act as a refreshing sea-wind that will help push the industry forward.” Indeed it is refreshing, and it leads the industry forward by leaps and bounds. At the same time there are also interesting developments going on with FirefoxOS, Ubuntu and Tizen. Whenever these are compared, Jolla seems to get the best reviews. However, SailfishOS, and particularly the ecosystem around it, has only just started to grow, so nothing is certain yet. The only thing we can be sure of is that we live in very interesting times. In the coming years, billions of people are going to buy new mobile phones, and for many of them, their mobile phone is going to be their primary device to get online and get involved in the information society. If you wish to become part of this, you can get involved in the SailfishOS community. SailfishOS can, in fact, be installed on other devices besides the Jolla phone, but we nevertheless recommend, in particular to Linux fans, scrolling down at Jolla.com and signing up for availability notifications so you can eventually get a Jolla phone for yourself.
Henri Bergius | 08:00, Thursday, 28 November 2013
BitCoin — the decentralized digital currency — has been making a lot of headlines lately. Much of this is driven by the current investment boom around it that has raised the exchange rates over the 1000 USD mark. But really, looking at BitCoin as a medium for currency or asset speculation is a bad idea. Instead, we should see it purely as a medium of exchange.
Not a great holder of value
Much of the press coverage of BitCoin sees it as a speculative investment free of currency manipulation by central banks. By its nature, it is a deflationary currency, given that the global supply of BitCoins is capped. As long as there is interest in BitCoin, prices are bound to go up. But at the same time, this is a threat to the currency: deflation disincentivizes people from spending their money, as tomorrow its value is likely to be higher.
This is a big difference from fiat currencies, where central banks generally act to ensure there is a little bit of inflation, giving people incentives to spend, invest, and keep the economy running.
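The incentive gap between the two regimes can be sketched with a little arithmetic. The 5% deflation and 2% inflation rates below are assumed purely for illustration, not taken from any actual BitCoin or fiat data:

```python
# Toy illustration of the deflation argument: when prices fall, idle
# money gains purchasing power, so holding beats spending; mild
# inflation does the opposite. Rates are assumptions for illustration.

def purchasing_power(units, annual_price_change, years):
    """Value of idle money, measured in today's goods, after `years`."""
    return units / (1 + annual_price_change) ** years

deflationary = purchasing_power(100, -0.05, 3)  # prices fall 5% a year
inflationary = purchasing_power(100, 0.02, 3)   # prices rise 2% a year

print(f"after 3 years of deflation: {deflationary:.2f}")
print(f"after 3 years of inflation: {inflationary:.2f}")
```

Under the assumed rates, 100 idle units buy roughly 116.6 units’ worth of goods after three deflationary years but only about 94.2 after three inflationary ones, which is exactly why a deflationary currency discourages spending.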
Traditional currencies are given a value by the fact that there is a government requiring taxes to be paid in them. This creates a demand in the currency, as people have to acquire units of that currency somehow, usually by selling goods and services. So, the value of a currency is built on the power of a state to tax its subjects in that currency.
The other traditional holder of value, gold, also differs from BitCoins in the sense that at least you can make some useful things out of it.
In contrast, the value of BitCoin without an economy backing it up is exactly zero. Without an active network of miners, and business transactions that create a demand for them, there is not much you could do with a BitCoin.
Given this setting, BitCoin as an investment mechanism resembles a Ponzi scheme, where people make speculative investments in the hope of passing them on to the next fool when prices go up.
This is obviously not sustainable in the long term. But luckily there are things you can do with BitCoin that still make it interesting if you treat it as a medium of exchange and not a long-term holder of value.
Great tool for conducting transactions
Now that we've established that BitCoins are not the best thing to put your life savings into, or to stash under your mattress, what is it exactly that makes them interesting?
BitCoin is a way of making transactions quickly, safely, and without middlemen. A lot of the articles written about it compare it to gold, but in my view a much closer comparison is M-Pesa, the operator-driven mobile payment network that powers over 30% of the Kenyan economy.
M-Pesa on a phone, by Pixel Ballads
When you see people paying with their mobile phones in an African street market, you know they've leapfrogged us in technology.
BitCoin has exactly the same advantages as M-Pesa: you can send it anywhere with minimal costs and delays, you can keep it on your phone, and anybody can ask for a payment through it. Unlike M-Pesa, BitCoin is truly decentralized and doesn't rely on a mobile operator to act as the "central bank".
Unlike M-Pesa, BitCoin currently requires an internet-connected smartphone with a wallet application. This limits its usefulness in the world outside of the rich countries (though cheap Android phones are changing the situation). It would be a huge boost for mobile commerce with BitCoins if there was a service that enabled paying and requesting BitCoin over SMS.
The one to make BitCoin work over SMS will be rich
Last summer I was talking with Joerg Patzer, the owner of Room 77, a restaurant in Berlin that has been one of the early adopters of BitCoin. At the time about 10% of his sales happened in BitCoin, and the number has been growing constantly since. He told me that he even pays some of his suppliers in BitCoin, and converts some of it to euros solely because you can't pay taxes in BitCoin.
Patzer buys the beer for Room 77 from the nearby Rollberg brewery, owned and run by qualified brew and malt meister Wilko Bereit. He pays for the barrels with BitCoins.
Joerg also told me about an interesting case where a group of tourists was having dinner in his restaurant. A friend of theirs who wasn't able to make the trip insisted on paying the bill remotely from Israel. This is something that would be hard to do with traditional credit cards or currencies. Paying in BitCoin is quite straightforward:
He taps the amount he owes Room 77 into the virtual Bitcoin wallet on his Android phone and, aligning it with a code on the bar's device, presses a button to process the payment. A theatrical "kerching" sound follows and Gallas is grinning from ear to ear. "It could hardly be easier," he insisted.
Requesting a payment using BitCoin Wallet for Android
Global currency for the internet
Requesting BitCoins is as easy as providing a URL. This could be presented as a link on the web, or shared to mobile devices via QR codes. For example, clicking this link on a device with a BitCoin wallet would enable you to send me BitCoins worth around 0.1 EUR (at today's exchange rates). No need to deal with payment gateways or complicated protocols.
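To illustrate just how simple such a payment link is, here is a sketch of building a BIP 21 "bitcoin:" URI. The address used is a made-up placeholder, not a real BitCoin address:

```python
from urllib.parse import urlencode

def payment_uri(address, amount_btc=None, label=None):
    """Build a BIP 21 'bitcoin:' payment URI for a given address."""
    params = {}
    if amount_btc is not None:
        # BIP 21 amounts are given in BTC, plain decimal notation
        params["amount"] = ("%.8f" % amount_btc).rstrip("0").rstrip(".")
    if label is not None:
        params["label"] = label
    uri = "bitcoin:" + address
    if params:
        uri += "?" + urlencode(params)
    return uri

# "1ExampleAddr" is a placeholder; a wallet app would reject it
print(payment_uri("1ExampleAddr", amount_btc=0.0002, label="dinner"))
```

Any wallet that understands the scheme can turn such a URI — or a QR code encoding it — directly into a pre-filled payment screen.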
Ease of paying, no middlemen, minimal transaction fees, and no risk of chargebacks make a lot of interesting business scenarios possible online. Consider, for example, micropayments for online content. This is something that could make online publishing viable without the current ad-driven models, but it has been held back by the risks and costs associated with credit card payments. Wider adoption of BitCoin would suddenly make this a very realistic way of asking for money for the things you do online.
Other clever uses
Since BitCoins can be exchanged in minuscule amounts, and a public record is kept of each transaction, they enable all kinds of new use cases.
One such service that I ran into yesterday is Proof of Existence, a tool for recording the fact that you were in possession of a document at a given time, and that it hasn't been revised since, based on microtransactions stored in the BitCoin block chain. This mechanism is known as trusted timestamping and could be useful for things we currently use notaries for. The Hacker News thread on this listed some interesting use cases, like verifying scientific priority on new inventions, i.e. proving that you had written a document describing a new invention at a given time even if you publish it only later.
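The fingerprinting step at the heart of this is easy to sketch: only a cryptographic hash of the document needs to be recorded, so the document itself stays private. A minimal illustration (the actual embedding into the block chain is left out):

```python
import hashlib

def document_fingerprint(data: bytes) -> str:
    """SHA-256 digest of a document; a timestamping service records
    this digest, not the document itself."""
    return hashlib.sha256(data).hexdigest()

# Later, anyone holding the document can recompute the digest and
# compare it against the timestamped one to prove it is unchanged.
print(document_fingerprint(b"my invention, described in full"))
```

Because the hash is one-way, publishing it reveals nothing about the document's contents, yet any later modification of the document would produce a completely different digest.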
We could also use BitCoin to tackle the spam problem. Emails could be "stamped" with a BitCoin microtransaction, and email servers could be configured to reject messages that don't have a valid payment attached to them. This would help bring decentralization back to the internet, as we could again use distributed systems like email and XMPP without having to resort to walled gardens like Facebook to keep spammers away.
Another interesting approach was to use BitCoin transactions for making bets.
How to get started
A good way to get started with BitCoin is to read Wired's BitCoin survival guide or the materials available on BitCoins.com. Install a BitCoin wallet application on your mobile device. Then get some BitCoin, either by buying locally or from an online exchange, or by selling goods or services for BitCoin.
This is all you need to get going. Now you just need to find a way to spend them, for example in online shops, by buying domains, or in shops and restaurants that accept them, like many do in my neighborhood.
Disclaimer: I'm a programmer, not an economist. There are better people to take investment advice from. I do hold a little bit of BitCoin, giving me some vested interest in the matter. Not the kind of amount that would make it an investment, but just enough to buy a few nice dinners, especially at Berlin prices. Generally, I try to keep the amount I have in BTC low enough to be equivalent to what I would be comfortable carrying in cash.
Losca | 07:36, Thursday, 28 November 2013
And then for something completely different: I've got my hands on a Jolla now, and it's beautiful!
A quick dmesg of course is among the first things to do...
And what it has eaten: Qt 5.1!
[    0.000000] Booting Linux on physical CPU 0
[ 0.000000] Initializing cgroup subsys cpu
[ 0.000000] Linux version 220.127.116.1131115.2 (abuild@es-17-21) (gcc version 4.6.4 20130412 (Mer 4.6.4-1) (Linaro GCC 4.6-2013.05) ) #1 SMP PREEMPT Mon Nov 18 03:00:49 UTC 2013
[ 0.000000] CPU: ARMv7 Processor [511f04d4] revision 4 (ARMv7), cr=10c5387d
[ 0.000000] CPU: PIPT / VIPT nonaliasing data cache, PIPT instruction cache
[ 0.000000] Machine: QCT MSM8930 CDP
... click for the complete file ...
qt5-qtconcurrent-5.1.0+git27-1.9.4.armv7hl
... click for the complete file ...
It was a very nice launch party, thanks to everyone involved.
Update: a few more at my Google+ Jolla launch party gallery
Wednesday, 27 November 2013
Paul Boddie's Free Software-related blog » English | 13:52, Wednesday, 27 November 2013
I have had reason to consider the way organisations make technology choices in recent months, particularly where the public sector is concerned, and although my conclusions may not come as a surprise to some people, I think they sum up fairly well how bad decisions get made even if the intentions behind them are supposedly good ones. Approaching such matters from a technological point of view – being informed about things like interoperability, systems diversity, the way people adopt and use technology, and the details of how various technologies work – it can be easy to forget that decisions around acquisitions and strategies are often taken by people who have no appreciation of such things and no time or inclination to consider them either: as far as decision-makers are concerned, such things are mere details that obscure the dramatic solution that shows them off as dynamic leaders getting things done.
Assuming the Position
So, assume for a moment that you are a decision-maker with decisions to make about technology, that you have in your organisation some problems that may or may not have technology as their root cause, and that because you claim to listen to what people in your organisation have to say about their workplace, you feel that clear and decisive measures are required to solve some of those problems. First of all, it is important to make sure that when people complain about something, they are not mixing that thing up with something else that really makes their life awkward, but let us assume that you and your advisers are aware of that issue and are good at getting to the heart of the real problem, whatever that may be. Next, people may ask for all sorts of things that they want but do not actually need – “an iPad in every meeting room, elevator and lavatory cubicle!” – and even if you also like the sound of such wild ideas, you also need to be able to restrain yourself and to acknowledge that it would simply be imprudent to indulge every whim of the workforce (or your own). After all, neither they nor you are royalty!
With distractions out of the way, you can now focus on the real problems. But remember: as an executive with no time for detail, the nuances of a supposedly technological problem – things like why people really struggle with some task in their workplace and what technical issues might be contributing to this discomfort – these things are distractions, too. As someone who has to decide a lot of things, you want short and simple summaries and to give short and simple remedies, delegating to other people to flesh out the details and to make things happen. People might try and get you to understand the detail, but you can always delegate the task of entertaining such explanations and representations to other people, naturally telling them not to waste too much time on executing the plan.
On the Wrong Foot
So, let us just consider what we now know (or at least suspect) about the behaviour of someone in an executive position who has an organisation-wide problem to solve. They need to demonstrate leadership, vision and intent, certainly: it is worth remembering that such positions are inherently political, and if there is anything we should all know about politics by now, it is that it is often far more attractive to make one’s mark, define one’s legacy, fulfil one’s vision, reserve one’s place in the history books than it is to just keep things running efficiently and smoothly and to keep people generally satisfied with their lot in life; this principle alone explains why the city of Oslo is so infatuated with prestige projects and wants to host the Winter Olympics in a few years’ time (presumably things like functioning public transport, education, healthcare, even an electoral process that does not almost deliberately disenfranchise entire groups of voters, will all be faultless by then). It is far more exciting being a politician if you can continually announce exciting things, leaving the non-visionary stuff to your staff.
Executives also like to keep things as uncluttered as possible, even if the very nature of a problem is complicated, and at their level in the organisation they want the explanations and the directives to be as simple as possible. Again, this probably explains the “rip it up and start over” mentality that one sees in government, especially after changes in government even if consecutive governments have ideological similarities: it is far better to be seen to be different and bold than to be associated with your discredited predecessors.
But what do these traits lead to? Well, let us return to an organisational problem with complicated technical underpinnings. Naturally, decision-makers at the highest levels will not want to be bored with the complications – at the classic “10000 foot” view, nothing should be allowed to encroach on the elegant clarity of the decision – and even the consideration of those complications may be discouraged amongst those tasked to implement the solution. Such complications may be regarded as a legacy of an untidy and unruly past that was not properly governed or supervised (and are thus mere symptoms of an underlying malaise that must be dealt with), and the need to consider them may draw time and resources away from an “urgently needed” solution that deals with the issue no matter what it takes.
How many times have we been told “not to spend too much time” on something? And yet, that thing may need to be treated thoroughly so that it does not recur over and over again. And as too many people have come to realise or experience, blame very often travels through delegation: people given a task to see through are often deprived of resources to do it adequately, but this will not shield them from recriminations and reprisals afterwards.
It should not demand too much imagination to realise that certain important things will be sacrificed or ignored within such a decision-making framework. Executives will seek simplistic solutions that almost favour an ignorance of the actual problem at hand. Meanwhile, the minions or underlings doing the work may seek to stay as close as possible to the exact word of the directive handed down to them from on high, abandoning any objective assessment of the problem domain, so as to be able to say if or when things go wrong that they were only following the instructions given to them, and that as everything falls to pieces it was the very nature of the vision that led to its demise rather than the work they did, or that they took the initiative to do something “unsanctioned” themselves.
The Magic Single Vendor Temptation
We can already see that an appreciation of the finer points of a problem will be an early casualty in the flawed framework described above, but when pressure also exists to “just do something” and when possible tendencies to “make one’s mark” lie just below the surface, decision-makers also do things like ignore the best advice available to them, choosing instead to just go over the heads of the people they employ to have opinions about matters of technology. Such antics are not uncommon: there must be thousands or even millions of people with the experience of seeing consultants breeze into their workplace and impart opinions about the work being done that are supposedly more accurate, insightful and valuable than the actual experiences of the people paid to do that very work. But sometimes hubris can get the better of the decision-maker to the extent that their own experiences are somehow more valid than those supposed experts on the payroll who cannot seem to make up their minds about something as mundane as which technology to use.
And so, the executive may be tempted to take a page from their own playbook: maybe they used a product in one of their previous organisations that had something to do with the problem area; maybe they know someone in their peer group who has an opinion on the topic; maybe they can also show that they “know about these things” by choosing such a product. And with so many areas of life now effectively remedied by going and buying a product that instantly eradicates any deficiency, need, shortcoming or desire, why would this not work for some organisational problem? “What do you mean ‘network provisioning problems’? I can get the Internet on my phone! Just tell everybody to do that!”
When the tendency to avoid complexity meets the apparent simplicity of consumerism (and of solutions encountered in their final form in the executive’s previous endeavours), the temptation to solve a problem at a single stroke or a single click of the “buy” button becomes great indeed. So what if everyone affected by the decision has different needs? The product will surely meet all those needs: the vendor will make sure of that. And if the vendor cannot deliver, then perhaps those people should reconsider their needs. “I’ve seen this product work perfectly elsewhere. Why do you people have to be so awkward?” After all, the vendor can work magic: the salespeople practically told us so!
The Threat to Diversity
In those courses in my computer science degree that dealt with the implementation of solutions at the organisational level, as opposed to the actual implementation of software, attempts were made to impress upon us students the need to consider the requirements of any given problem domain because any solution that neglects the realities of the problem domain will struggle with acceptance and flirt with failure. Thus, the impatient executive approach involving the single vendor and their magic product that “does it all” and “solves the problem” flirts openly and readily with failure.
Technological diversity within an organisation frequently exists for good reason, not to irritate decision-makers and their helpers, and the larger the organisation the larger the potential diversity to be found. Extrapolating from narrow experiences – insisting that a solution must be good enough for everyone because “it is good enough for my people” – risks neglecting the needs of large sections of an organisation and denying the benefits of diversity within the organisation. In turn, this risks the health of those parts of an organisation whose needs have now been ignored.
But diversity goes beyond what people happen to be using to do their job right now. By maintaining the basis for diversity within an organisation, it remains possible to retain the freedom for people to choose the most appropriate systems and platforms for their work. Conversely, undermining diversity by imposing a single vendor solution on everyone, especially when such solutions also neglect open standards and interoperability, threatens the ability for people to make choices central to their own work, and thus threatens the vitality of that work itself.
Stories abound of people in technical disciplines who “also had to have a Windows computer” to do administrative chores like fill out their expenses, hours, travel claims, and all the peripheral tasks in a workplace, even though they used a functioning workstation or other computer that would have been adequate to perform the same chores within a framework that might actually have upheld interoperability and choice. Who pays for all these extra computers, and who benefits from such redundancy? And when some bright spark in the administration suggests throwing away the “special” workstation, putting administrative chores above the real work, what damage does this do to the working environment, to productivity, and to the capabilities of the organisation?
Moreover, the threat to diversity is more serious than many people presumably understand. Any single vendor solution imposed across an organisation also threatens the independence of the institution when that solution also informs and dictates the terms under which other solutions are acquired and introduced. Any decision-maker who regards their “one product for everybody” solution as adequate in one area may find themselves supporting a “one vendor for everything” policy that infects every aspect of the organisation’s existence, especially if they are deluded enough to think that they are getting a “good deal” by buying all their things from that one vendor and thus unquestioningly going along with it all for “economic reasons”. At that point, one has to wonder whether the organisation itself is in control of its own acquisitions, systems or strategies any longer.
Somebody Else’s Problem
People may find it hard to get worked up about the tools and systems their employer uses. Surely, they think, what people have chosen to run a part of the organisation is a matter only for those who work with that specific thing from one day to the next. When other people complain about such matters, it is easy to marginalise them and to accuse them of making trouble for the sake of doing so. But such reactions are short-sighted: when other people’s tools are being torn out and replaced by something less than desirable, bystanders may not feel any urgency to react or even think about showing any sympathy at all, but when tendencies exist to tackle other parts of an organisation with simplistic rationalisation exercises, who knows whose tools might be the next ones to be tampered with?
And from what we know from unfriendly solutions that shun interoperability and that prefer other solutions from the same vendor (or that vendor’s special partners), when one person’s tool or system gets the single vendor treatment, it is not necessarily only that person who will be affected: suddenly, other people who need to exchange information with that person may find themselves having to “upgrade” to a different set of tools that are now required for them just to be able to continue that exchange. One person’s loss of control may mean that many people lose control of their working environment, too. The domino effect that follows may result in an organisation transformed for the worse based only on the uninformed gut instincts of someone with the power to demand that something be done the apparently easy way.
Getting the Message Across
For those of us who want to see Free Software and open standards in organisations, the dangers of the top-down single vendor strategy are obvious, but other people may find it difficult to relate to the issues. There are, however, analogies that can be illustrative, and as I perused a publication related to my former employer I came across an interesting complaint that happens to nicely complement an analogy I had been considering for a while. The complaint in question is about some supplier management software that insists that bank account numbers can only have 18 digits at most, but this fails to consider the situation where payments to Russian and Chinese accounts might need account numbers with more than 18 digits, and the complainant vents his frustration at “the new super-elite of decision makers” who have decided that they know better than the people actually doing the work.
If that “super-elite” were to call all the shots, their solution would surely involve making everyone get an account with an account number that could only ever have 18 digits. “Not supported by your bank? Change bank! Not supported in your country? Change your banking country!” They might not stop there, either: why not just insist on everyone having an account at just one organisation-mandated bank? “Who cares if you don’t want a customer relationship with another bank? You want to get paid, don’t you?”
At one former employer of mine, setting up a special account at a particular bank was actually how things were done. But ignoring peculiarities related to the nature of certain kinds of institutions, making everyone needlessly conform through some dubiously justified, executive-imposed initiative – whether it be requiring them to have an account with the organisation’s bank, or requiring them to use only certain vendor-sanctioned software (and as a consequence requiring them to buy certain vendor-sanctioned products so that they may have a chance of using them at work or to interact with their workplace from home) – is an imposition too far. Rationalisation is a powerful argument for shaking things up, but it is often used by those who do not care how much inconvenience it transfers from the organisation to the individual and to other parties.
Bearing the Costs
We have seen how the organisational cost of short-sighted, buy-and-forget decision-making can end up being borne by those whose interests have been ignored or marginalised “for the good of the organisation”, and we can see how this can very easily impose costs across the whole organisation, too. But another aspect of this way of deciding things can also be costly: in the hurry to demonstrate the banishment of an organisational problem with a flourish, incremental solutions that might have dealt with the problem more effectively can become as marginalised as the influence of the people tasked with the job of seeing any eventual solution through. When people are loudly demanding improvements and solutions, an equally dramatic response probably does not involve reviewing the existing infrastructure, identifying areas that can provide significant improvement without significant inconvenience or significant additional costs, and committing to improve the existing solutions quietly and effectively.
Thus, when faced with disillusionment – that people may have decided for themselves that whatever it was that they did not like is now beyond redemption – decision-makers are apt to pander to such disillusionment by replacing any existing thing with something completely new. Especially if it reinforces their own blinkered view of an organisational problem or “confirms” what they “already know”, decision-makers may gladly embrace such dramatic acts as a demonstration of the resolve expected of a decisive leader as they stand to look good by visibly banishing the source of disillusionment. But when such pandering neglects relatively inexpensive, incremental improvements and instead incurs significant costs and disruptions for the organisation, one can justifiably question the motivations behind such dramatic acts and the level of competence brought to bear on resolving the original source of discomfort.
Thinking that putting down money with a single vendor will solve everybody’s problems, purging diversity from an organisation and stipulating the uniformity encouraged by that vendor, is an overly simplistic and even deluded approach to organisational change. Change in any organisation can be very expensive and must therefore be managed carefully. Change for the sake of change is therefore incredibly irresponsible. And change imposed to gratify the perception of change or progress, made on a superficial basis and incurring unnecessary and avoidable burdens within an organisation whilst risking that organisation’s independence and viability, is nothing other than indefensible.
Be wary of the “single vendor fixes it all” delusion, especially if all the signs point to a decision made at the highest levels of your organisation: it is the sign of the organisational panic button being pressed while someone declares “Mission Accomplished!” Because at the same time they will be thinking “We will have progress whatever the cost!” And you, not them, will be the one bearing the cost.
Losca | 09:58, Wednesday, 27 November 2013
Background
I upgraded from Linux 3.8 to 3.11 recently, along with newer Mesa, X.Org and Intel drivers, and found that a small workaround was needed because of upstream changes.
The upstream change was Add "Automatic" mode for "Broadcast RGB" property, with Automatic as the new default. This is a sensible default, since many (most?) TVs default to the more limited 16-235 range, and continuing to default to Full from the driver side would mean wrong colors on the TV. I've set my screen to support the full 0-255 range, so as not to cut down the number of available shades of color.
Unfortunately it seems the Automatic setting does not work for my HDMI input, i.e. blacks become grey since the driver still outputs the more limited range. Maybe there is something to improve on the driver side, but I'd guess it's more that my 2008 Sony TV has a mode for which the standard suggests limited range. I remember the TV did default to limited range, so maybe the EDID data from the TV does not change when setting the RGB range to Full.
I hope the Automatic setting works to offer full range on newer screens and the modes they have, but that's probably up to the manufacturers and standards.
Below is an illustration of the correct setting on my Haswell CPU. When the Broadcast RGB is left to its default Automatic setting, the above image is displayed. When set to Full, the image below with deeper blacks is seen instead. I used manual settings on my camera so it's the same exposure.
Workaround
For me the workaround has evolved to the following so far. Create a /etc/X11/Xsession.d/95fullrgb file:
if [ "$(/usr/bin/xrandr -q --prop | grep 'Broadcast RGB: Full' | wc -l)" = "0" ] ; then
    /usr/bin/xrandr --output HDMI3 --set "Broadcast RGB" "Full"
fi
And since I'm using lightdm, pointing it at the same script from /etc/lightdm/lightdm.conf means the flicker only happens once during bootup:
[SeatDefaults]
display-setup-script=/etc/X11/Xsession.d/95fullrgb
Important: when using the LightDM setting, enable the executable bit (chmod +x) on /etc/X11/Xsession.d/95fullrgb for it to work. Obviously also check your output; for me it was HDMI3.
If there is no situation where it would switch back to the "Limited 16:235" setting on its own, the display manager script should be enough; having it in /etc/X11/Xsession.d as well is redundant and slows login down. I think for me it went from about 2 seconds to 3 seconds, since executing an xrandr query is not cheap.
Misc
Note that, unrelated to Full range usage, the Limited range currently behaves incorrectly on Haswell until the patch in bug #71769 is accepted. That means the blacks are grey in Limited mode even if the screen is also set to Limited.
I'd prefer there would be a kernel parameter for the Broadcast RGB setting, although my Haswell machine does boot so fast I don't get to see too many seconds of wrong colors...
Tuesday, 26 November 2013
DanielPocock.com - fsfe | 21:41, Tuesday, 26 November 2013
Here is an overview of the real-time market data architecture based on OpenMAMA:
All these components may be running on a single host or they may be distributed across different servers and workstations in a LAN. The OpenMAMA market data bus ties them together.
External data sources
These are external sources of data. They could be currency trading firms, stock brokers, bullion dealers or even Bitcoin exchanges. Some exchanges don't provide data services directly to the general public; instead they distribute their price data through third-party data vendors such as Reuters and Bloomberg.
The feed handlers manage the connections to external data sources. These are typically daemon processes that implement the wire protocols used by the external vendors for transmitting data over the public internet or a leased line. They typically use a point-to-point topology such as TCP connections.
The feed handlers publish the data into the market data bus.
In practice, the feed handler may be a proprietary application provided by the data vendor or it may simply be a Python script that fetches exchange rates from a URL every five minutes.
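A feed handler of that simple polling kind might look like the sketch below. The URL is a placeholder, and printing the tick stands in for the actual step of publishing onto the market data bus:

```python
import json
import time
from urllib.request import urlopen

FEED_URL = "https://example.com/rates.json"  # placeholder data source

def fetch_rates(url=FEED_URL):
    """Fetch a JSON document of rates, e.g. {"EURUSD": 1.3542}."""
    with urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

def format_tick(symbol, price):
    """Render a tick; a real handler would publish via OpenMAMA instead."""
    return "tick %s %.5f" % (symbol, price)

def run(poll_seconds=300):
    """Poll the source every five minutes and emit each rate."""
    while True:
        for symbol, price in fetch_rates().items():
            print(format_tick(symbol, price))  # stand-in for publishing
        time.sleep(poll_seconds)
```

The same skeleton scales down to a cron job or up to a daemon that speaks a vendor's wire protocol; only the fetch and publish steps change.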
The market data bus is a distributed framework that is accessible to all of the local servers and workstations in the LAN.
At the lowest level a messaging middleware solution is used to transport the data. The Avis Event Router is a free middleware option. There is upcoming support for Apache QPID as well. Some commercial middleware is supported too - here is the full list
OpenMAMA does not provide a daemon or server process of its own. It is a set of libraries that operate on top of the middleware transport.
In very general terms, the OpenMAMA libraries let applications publish messages (such as price ticks) or subscribe to receive the messages published by other applications.
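The publish/subscribe model can be illustrated with a toy in-process sketch. This shows the pattern only, not the actual OpenMAMA API, which works through transport and subscription objects provided by the libraries:

```python
class Bus:
    """Toy in-process stand-in for the market data bus."""
    def __init__(self):
        self.topics = {}

    def subscribe(self, topic, on_msg):
        """Register a callback to receive messages on a topic."""
        self.topics.setdefault(topic, []).append(on_msg)

    def publish(self, topic, msg):
        """Deliver a message to every subscriber of the topic."""
        for on_msg in self.topics.get(topic, []):
            on_msg(msg)

bus = Bus()
bus.subscribe("FX.EURUSD", lambda msg: print("EURUSD tick:", msg))
bus.publish("FX.EURUSD", {"bid": 1.3541, "ask": 1.3543})
```

The key property is decoupling: the publisher never needs to know who is listening, which is what lets feed handlers, spreadsheets and valuation servers be mixed and matched on the same bus.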
Spreadsheet application (for example, LibreOffice)
LibreOffice is introducing a new data import feature for streaming time series data. This appears to provide a useful integration point for OpenMAMA and discussion is taking place in the development community.
The spreadsheet can operate in various ways. A simple application may just poll the currency prices when the spreadsheet is opened or refreshed. A more demanding application may see the spreadsheet come alive, recalculating all cells on every tick from the data vendor (several times per second). This latter scenario is more common in dealing rooms and hedge funds.
There are many free accounting applications today, including PostBooks and GnuCash, and some more heavyweight solutions like Adempiere and OpenERP. In a world where credit cards and the world wide web have made international trade an everyday activity, many people are using this software to track expenses and accounts in more than one currency. For example, a British business may be paying some suppliers in Euros and a Canadian may be charging some customers in US dollars. A Swiss person may be keeping some of their savings in gold bullion in the vault of one of those world-renowned Swiss banks.
A common requirement for all of these users is the balance sheet. Using real-time market data sources, the balance sheet can be refreshed at any time from the live market prices. A home user may just find it convenient that they can open the balance sheet and always see their net worth immediately without having to manually cut and paste the currency prices from a web page. A business user operating in a competitive industry with low margins may be checking the balance sheet several times per day to ensure they remain solvent and profitable.
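The revaluation step itself is simple arithmetic. A minimal sketch, with illustrative balances and rates (the numbers are made up, not real market prices):

```python
def net_worth(balances, rates, base="GBP"):
    """Value each holding in the base currency using the latest prices.

    balances: {currency_or_asset: amount held}
    rates:    {currency_or_asset: price of one unit in the base currency}
    """
    total = 0.0
    for asset, amount in balances.items():
        rate = 1.0 if asset == base else rates[asset]
        total += amount * rate
    return total

# A British user holding pounds, euros and some gold (illustrative rates).
balances = {"GBP": 1000.0, "EUR": 500.0, "XAU": 2.0}
rates = {"EUR": 0.75, "XAU": 800.0}
print(net_worth(balances, rates))
```

Each time a fresh tick arrives from the market data bus, the accounting application only needs to update `rates` and call `net_worth` again.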
Many web shops now offer the customers the convenience of viewing prices in their own currency. It is important to make sure these prices are accurate, especially when dealing with volatile currencies or when the products have a low profit margin.
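One simple way a shop might do this is to convert the list price with the latest rate, adding a small buffer against rate moves between quoting and settlement. Everything here is illustrative: the rates, the markup and the function name are assumptions, not any shop's real pricing logic.

```python
def display_price(base_price_usd, fx, customer_ccy, markup=0.02):
    """Quote a USD list price in the customer's currency.

    fx maps a currency code to units of that currency per USD.
    The markup cushions the shop against a volatile rate moving
    between the moment of quoting and the moment of settlement.
    """
    if customer_ccy == "USD":
        return round(base_price_usd, 2)
    return round(base_price_usd * fx[customer_ccy] * (1 + markup), 2)

fx = {"EUR": 0.75, "JPY": 102.0}  # illustrative rates, refreshed from the feed
print(display_price(100.0, fx, "EUR"))
```

With a live feed, `fx` would be refreshed on every tick, so displayed prices track the market instead of a stale daily fixing.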
In fact, this web-based streaming price update mechanism is exactly how many online financial trading services offer live market prices to their customers.
Real-time valuation server
In a large organisation there may be many users looking at the same values. For example, in a trading desk, many users may need to see the desk's overall position in each market. In a busy web shop, many concurrent users may need to see the prices of common products.
Rather than recalculating these values for each user in parallel, a common solution involves setting up a server to receive raw values (such as currency prices) from the data feed, calculate values needed by local users and then broadcast those values over the market data bus.
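The pattern can be sketched as a small publish/subscribe server: raw ticks come in, derived values are computed once, and every subscriber receives the result. This is a toy model of the idea, not OpenMAMA's API; the class, the callback mechanism and the EURJPY cross-rate example are all invented for illustration.

```python
class ValuationServer:
    """Compute each derived value once per tick and fan it out to every
    subscriber, instead of each client recomputing it in parallel."""

    def __init__(self):
        self.raw = {}          # latest raw prices from the data feed
        self.subscribers = []  # callbacks registered by client applications

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def on_tick(self, symbol, price):
        """Called by the feed handler for every raw update."""
        self.raw[symbol] = price
        derived = self.revalue()
        for cb in self.subscribers:  # broadcast over the (simulated) bus
            cb(derived)

    def revalue(self):
        # Example derived value: a cross rate computed from two raw legs.
        d = dict(self.raw)
        if "EURUSD" in d and "USDJPY" in d:
            d["EURJPY"] = d["EURUSD"] * d["USDJPY"]
        return d

server = ValuationServer()
server.subscribe(lambda v: print(v.get("EURJPY")))
server.on_tick("EURUSD", 1.25)
server.on_tick("USDJPY", 100.0)  # once both legs exist, the cross rate is broadcast
```

However many clients subscribe, the cross rate is computed exactly once per tick.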
How applications can integrate with real-time market data
Here are some steps for getting started:
- See my earlier blog on OpenMAMA for some very trivial code samples
- The OpenMAMA developers guide
- Sample code in the repository (for both C/C++ samples and Java samples)
- Build and install it from sources
- Use the packages on Debian or Ubuntu - RPM packages will hopefully be developed in the near future, most likely after the OpenMAMA build system update is complete.
Don't Panic | 16:46, Tuesday, 26 November 2013
Today a guest post from Silvan, who organises the Berlin Fellowship group together with me. Many thanks!
On 14 November the Berlin Fellowship group held its first meeting under a new concept. The basic idea is to structure every future meeting around three recurring blocks:
- a 15-minute summary of the FSFE's activities of the past month, given by a staff member from the Berlin office
- a one-hour topic that changes monthly, possibly introduced with a short talk and then moderated by an "owner" from the Berlin Fellowship group
- a 15-minute meta block with feedback, planning of the topic and owner for the next meeting, and other activities.
For the first meeting this new structure worked out wonderfully, even though the intended time limits for the blocks are not easy to keep. In future we will either have to fine-tune the plan or simply be more disciplined about the clock.
In any case, the first topic, "free calendar solutions", clearly struck a nerve. In a short presentation focused on free CalDAV servers (ownCloud!) and the corresponding free Android clients (aCal! CalDAV Sync Adapter! DAVdroid!), Silvan presented practical solutions for everyday use and met with lively interest. The subsequent conversation grew out of the participants' own experiences as well as burning questions like "client-server architecture - doesn't that mean the NSA is always listening in?", plus the perennial debates about "Android - is it even free?" and "What is Linux-libre, anyway?" As usual, the discussions and the accompanying socialising went on late into the evening, and they will of course continue at our next meetings.
As a first positive conclusion we can at least note that our "meeting with a topic" attracted more people than the last unstructured meetings did. Whether that is a fixed rule, the future will show. Erik will be the owner next time; he is preparing the topic "read-later/read-offline" and will present his favourite solution, Zotero. All Fellows from Berlin and the surrounding area are warmly invited!
DanielPocock.com - fsfe | 14:01, Tuesday, 26 November 2013
There is big news in Europe right now, especially the UK, about the Scottish nation's manifesto for independence.
I thought this was really amazing, a country that is going to move to free software and protect its citizens from horrors like DRM. I couldn't wait to find out more, for example, will they use Linux or *BSD?
After all, how many nations today can really consider themselves independent when they are trapped in the use of complex and opaque systems that require ongoing royalty payments to a foreign corporation, much like ancient villages paying their tithes to Rome? Today it is actually worse: these independent nations are not just sacrificing their cash - the new Rome is also using those same secret, proprietary systems to raid their subjects' privacy.
So I went to the manifesto's web site to get the details. I couldn't even find the words "software" or "technology" listed in the policy index on the front page - and software doesn't even appear in the search results. A section on Culture, Communications and Digital only really looks at the issues for the media (BBC after independence). They don't even make any comment about the British Telecom (BT) pension crisis and whether Scotland will abandon BT's services, leaving only those customers south of the border to continue subsidising the deficit through higher phone bills.
There is some hope, with the dot scot (.scot) top level domain coming in 2015 regardless of the referendum outcome.
Does this mean an independent Scotland is going to be run without technology at all then? That could be an interesting way to avoid the type of infiltration of communications technology that has been taking place in Brussels. Or do the people calling the shots simply fail to realise the impact of software and communications technology on a modern day concept of independence?
Party date set
For those who just care about gatecrashing the post-independence party, start booking your leave for March 2016.
Monday, 25 November 2013
I love it here » English | 20:31, Monday, 25 November 2013
David Wheeler wrote an interesting article about the economics of vulnerabilities. He fears that the current “‘vulnerability bidding wars’ [...] will create an overwhelming tsunami of zero-days available to a wide variety of malicious actors.” Beside describing some general problems of bounties in the security field, the main point of his article is the idea to increase security by criminalising the selling of “vulnerability information to anyone other than the supplier or the reporter’s government.”
About the effects of the vulnerability economics on Free Software Wheeler writes:
The current situation might impede the peer review of open source software (OSS), since currently people can make more money selling an exploit than in helping the OSS project fix the problem. Thankfully, OSS projects are still widely viewed as public goods, so there are still many people who are willing to take the pay cut and help OSS projects find and fix vulnerabilities. I think proprietary and custom software are actually in much more danger than OSS; in those cases it’s a lot easier for people to think “well, they wrote this code for their financial gain, so I may as well sell my vulnerability information for my financial gain”.
Seravo | 10:01, Monday, 25 November 2013
IT isn’t typically considered to be very environmentally friendly or cheap, but the Finnish Association for Nature Conservation’s (FANC’s) strategy of using open source software has helped it achieve both.
The issue of environmentally friendly IT is actively discussed at the Finnish Association for Nature Conservation (Suomen Luonnonsuojeluliitto). A number of web services are at the core of its operations, e.g. in managing the organisation, publishing releases and generally protecting the environment. By making smart choices, the FANC has been able to build web services while conserving the environment and even saving some money in the process. Here’s how they did it.
The key to this problem has been to make open source a strategic choice. All their servers are Linux systems running open source software exclusively.
But what makes Linux more environmentally friendly than say for example Windows servers?
Out with the old, in with the new?
The first issue with the environmental aspects of IT is the rather short lifespan of the hardware. We’re used to always getting the newest phones, computers and other devices, only to toss them in the bin after a couple of years of use. Manufacturing uses up a lot of energy and water, not to mention the abundant use of rare earth metals and toxic chemicals.
Replacing devices at a yearly rate is also a matter of cost. At the end of their lifespans, devices can’t be sold and reused because their market price has dropped so far. Recycling electronics also requires energy-intensive processes in large facilities.
The sad part is that the short lifespan of these devices usually isn’t due to the device breaking but rather a perception of the devices getting sluggish and outdated.
This is caused mostly by heavy marketing and fashion trends. Also playing a part is bad software that’s no longer compatible with older devices. The remedy for this at the Finnish Association for Nature Conservation has been to opt for open source software.
Linux, like all open source software, is published under licenses that allow users to freely use and develop the software. There’s no one to restrict the use of the software or to collect hefty license fees. With open source software, revenue is generated through support services that aim to increase the reliability of the system rather than just make it run faster on new hardware.
Thanks to the open source development model, the software supports a wide spectrum of hardware platforms. Even older machines get support; meaning that you can install a brand new Linux operating system on a three-year-old computer that wouldn’t install Windows 8.
Even when the device itself doesn’t get upgrades, free software updates satisfy the user’s insatiable tech-lust. With open source, often the software actually works faster after the update!
In the case of the Finnish Association for Nature Conservation, the expected lifespan of their server hardware exceeds six years, whereas typically you’re lucky to get more than three years of life out of servers in similar roles. Currently their oldest machine is nine years old and still running. Some of their machines have had components replaced, which is still better than replacing the entire machine.
Doubling the lifespan means roughly a 50% decrease in hardware costs and in environmental damage. Because there is always a backup machine ready to go, you can run your current machines until the very end and then replace them. An old machine dying and being replaced causes no problems, thanks to a Linux clustering solution in which its tasks are automatically taken over by the other available machines.
The FANC has also been able to take advantage of the low prices of last-generation hardware. Thanks to the huge demand for only the newest machines, you can get brand new last-generation machines for next to nothing. With Linux, you can easily keep using three-year-old machines for another three to five years.
Long lifespans also reduce effort. The Linux distributions the FANC uses all have five- or seven-year security update policies. This means that a properly set up machine can go all the way through its seven-year lifespan without ever needing a complete reinstall of the system.
With open source software there’s no excuse to avoid updates. All feature updates are provided free of charge, thus motivating the user to always keep their software up to date.
Down with power consumption
Another critical challenge with environmentally friendly IT is, of course, power consumption.
A Japanese study conducted in 2004 found that manufacturing a single desktop machine consumed three times as much electricity as a typical year of using the finished product; adding in all other manufacturing-related processes quadruples that figure.
It often doesn’t make sense to buy a new, less power-hungry machine just to be more environmentally friendly; however, if the energy saved in use exceeds the energy spent manufacturing the new device, the purchase can be worthwhile.
Replacing old CRT monitors with new flat panels and moving from desktop workstations to laptops will decrease your power bill significantly. The same applies to newer server machines that use solutions to reduce their power consumption and heat output.
Both actions have been taken to further optimise the FANC’s energy consumption.
“At the central office we’re trying to conserve power by all means possible. In public spaces, for example, we removed half of the fluorescent tubes and nobody could even tell the difference. Naturally we take the environmental aspects of IT into account as well,” said Pekka Saari, IT expert at the FANC.
Last summer, the FANC purchased second-hand server machines that were both cheap and consume a fraction of their predecessors’ energy. According to their calculations, they’re saving electricity at a yearly rate equivalent to a one-bedroom apartment’s consumption. The FANC’s servers all run on eco-energy.
“As the web content creator I’ve been happy with the decisions made here. I can safely estimate that IT in our office works better than in most larger scale organisations”, said Maija Lielahti, the web reporter.
This article was originally published in Finnish at VihreaTuuma.fi.
mina86.com | 00:24, Monday, 25 November 2013
“…and this pun with savegames in GTA.” my friend said laughing.
“What pun? What savegames?” I asked with a blank stare.
“You know, ‘Jesus saves’.” he explained looking at me like I'm crazy.
“Wait, you could save game in GTA?” I raised my brow in disbelief.
That's how I found out about savegames in GTA. This was years after I finished the game, twice, completing each city in one go. But as fun as GTA was, it wasn't the game I spent most time playing. That title goes to Doom 2. To this day, I put it at the top of all FPS games ever.
Why do I mention those old, long forgotten titles? Let's fast forward a little.
I don't usually buy games on a tangible medium. Doom 3, however, is one of the few that sits on my shelf. When I played it though, I had this uneasy feeling… Something wasn't right. Almost as if I didn't get what I had hoped for. The game was dark, and slow. Encounters with monsters jumping out from the darkness were separated by journeys through dark corridors. Dim lights showed you the way in the darkness as you slowly progressed in each level, but did not reveal the danger, as that was hidden in dark corners.
I'm not a great writer, so that's probably why I overused “dark” in the above paragraph. In my defence, so did id Software. In some respects, Doom 3 felt like a technical demo of their shiny rendering engine. The team went overboard with all the new features, and instead of creating a successor to a game I love so much, they've created a poor attempt at duplicating System Shock 2 (a game which, even with its “outdated” graphics, is much more enjoyable and immersive than Doom 3)… or something… I don't know what exactly, to be honest.
Fortunately, there's also Classic Doom 3. A mod created by Flaming Sheep Software. It is a remake of shareware levels of the original Doom, and it is wonderful. It's fast paced, with energetic music, there's no unneeded interruptions, no cut-scenes, and as far as I'm concerned, that's the game I wanted id Software to make. It's a pity that they didn't.
Still, I cannot say it was a bad game, and I did not, at that time, lose confidence in id Software's ability to create a great shooter. That's why, when I craved some good FPS, I got my hands on Rage, id's latest product.
So I'm starting the game. Standing by the exit of the Ark, I check my grip on the mouse — I'm ready for my first encounter. Crouching, I enter the wilderness; watching my every step, I'm prepared to face the—
“Wait, what just happened?” I look at the screen baffled. “Oh, it's just a cut-scene, an extended intro I guess…” I take a sip of my tea while the protagonist sits in a car-thingy to escape danger with a newly found friend. A friend, who later leads to a realisation, that dialogues are not skippable — NPCs just keep talking, and one has to wait till they're finished to get a quest. Walking away looks like a valid work-around though.
More playing revealed that Rage is some kind of hybrid between Borderlands and Death Race, both of which are games worth recommending, but when id made them, it felt like another technical demo. As if they wanted to show how one can do both a shooter and a racing game using their engine. Pity they themselves couldn't. Riding the vehicles only inflated gameplay time without adding anything to the plot or actual shooter experience. The “technical demo” impression was reinforced by the fact that Rage took two short evenings to finish.
In the end I did finish Rage (even with all the boring time sinks), which is more than I can say about GTA 4. Recalling all the fun I had playing GTA, I decided to take a look at its latest version.
Admittedly, the intro was nicely done, and I really enjoyed it. It was not as good as Witcher's, but still, a calm pace and “cast” information nicely intermingled into the video, set a mood for a hopefully interesting game. Then, the developers decided to treat me like an idiot. I got used to games assuming I have no idea how to move, so with a slight sigh I got through this part. I still had hope.
Unfortunately, the more I played, the more I disliked GTA 4. Instead of having mission after mission, where I steal cars, run away from police, and run over innocent pedestrians, I was greeted with a ton of boring cut-scenes, a bowling simulator, a dating simulator, and other random content (none of which was particularly good). The things I loved GTA for so much were, however, hard to find.
Call of Doom
Are we really entering an age where games are all about slick graphics, heavily scripted locations and long cut-scenes? Playing various triple-A titles, I certainly feel like that's the case and that it's progressing rapidly — for instance, with Mass Effect I had a feeling that the number of cut-scenes increased and the complexity of dialogues decreased with each sequel, and even though the last one reportedly had more dialogue lines than any previous instalment, that was most likely because the dialogues were longer, not because the player had more choices.
But my opinion is, of course, not the only one. Some say scripted FPS games make for a more immersive experience. That such games can be viewed as “interactive books”. So I gave this idea a try and played through the single player campaign of Battlefield 3.
Immersed was definitely not what I felt. More like annoyed.
For starters, each scripted encounter had its own one-time mechanics, on top of which the developers could not even write a proper key binding configuration system. The end result was that I had to replay some of the scenes multiple times.
Furthermore, characters in books can travel great distances in an instant. On one page they are here; on the next, they are across the city. No problem. But in Battlefield 3 it felt like the second most time-consuming part of the game (after replaying scripted encounters) was mindlessly running after some other character – if I really wanted to play Desert Bus, I would have done so…
Hope must lay with the proles
I sometimes feel we'd be better off without all the fancy graphics cards able to render nearly movie-like scenes, in a world where developers would have to think of ways to interest the player other than just adding more non-interactive content. This is of course an exaggeration; I'll be the first to admit that games like VVVVVV should never have been created, that Notch demonstrated how lazy and/or incompetent he is by using graphics from the 1980s, and that Another World 20th Anniversary Edition is even more enjoyable than its original. But the point is that the game was fun to play even without those graphics.
But there seems to be something about having limited resources to develop a “next generation” graphical engine, that forces game developers to think of different ways to engage the player.
I've also had the pleasure of trying out Civilization V, and I say “pleasure” despite not being a huge fan of the franchise (probably because, sadly, I'm not that good at strategy games). Even though the graphics and various screens were updated, and some gained 3D animated figures, there were no intrusive, overly long cut-scenes. No time sinks adding nothing to the gameplay. No overuse of the engine. The studio kept to the roots, preserving what made the game great and updating it only where it made sense.
In the past I would make fun of games like FIFA or Need for Speed (admittedly I've never played the former and only a little of the latter), where it would seem that each new version is just updated graphics and new players/cars to choose from. But maybe FPS and RPG developers should look closely at such franchises and try not to introduce more stuff whose only purpose is to look good. Like the meme goes, “try not adding another cut-scene on your way to the car.”
Friday, 22 November 2013
Thinking out loud | 14:17, Friday, 22 November 2013
The format was interesting: “speed geeking”. Seven tables, groups of 10 to 15 people, and one speaker per table presenting his or her project. We were supposed to “give the audience a broader idea of what open is”. I gave the same five-minute talk seven times, presenting FSFE, Open Standards and Document Freedom Day.
My talk insisted on the aim (freedom), not on the means (opening stuff). After a short presentation of Free Software and FSFE, I switched to Open Standards and tried to explain why something even more technical and remote than software (standards) is also political.
Like with Free Software, questions around standardisation (or the lack of it) often boil down to “who controls the technology we rely on?”. Since the people in the audience were mostly open knowledge, open culture and open data advocates, I stressed and explained why data has to be in an Open Standard if it is to be archived in the long run and spread as widely as possible.
The train tracks metaphor worked well to explain standards. If all train tracks have the same width, trains can run everywhere. If the width changes every 15 km because a different company built each stretch, long distance train lines can’t exist. From there it is possible to go back to IT, digital collaboration and freedom of choice. With a few more minutes, it would have been nice to extend the topic to DRM and other creative ways invented to lock users in (and why we shouldn’t accept them). All the topics we work on are linked; we can’t stress that enough.
The other talks sounded very interesting too, but unfortunately I couldn’t listen to them. See Hacks Hackers Berlin, Africa Hack Trip, the Open Bank Project and Code for All from (de) Open Knowledge Foundation Germany.
Note: The room was full of women \o/ ! (..and full of macs…)
Looking forward to doing more with these communities!