Hacking the video stream for BlueJeans on Linux

Like most of the rest of the world, I'm working from home and stuck inside. I saw some folks who had virtual backgrounds set up on Zoom, and I wondered if something like that was possible for the videoconferencing service that my employer (Red Hat) uses, BlueJeans. The short answer is: no. BlueJeans has no native support for anything other than a regular webcam stream.

But this is Linux. We don't stop at the short answer.

I started thinking: surely, it has to be possible to "man in the middle" the video stream. And indeed, it is. I did all of this on Fedora 32 (x86_64), but it should work on other distributions as well.

Step 1: v4l2loopback

v4l2loopback is a kernel module which creates virtual V4L2 loopback video devices. Most (all?) webcams supported by Linux show up as V4L2 devices.
This module is not in the upstream kernel, so you need to pull the sources from git and build it locally. The GitHub home is: https://github.com/umlaeute/v4l2loopback.

Don't forget to install the kernel-devel and kernel-headers packages that correspond to the running kernel on your system:
sudo dnf install kernel-devel kernel-headers

Now, we need to clone the v4l2loopback source code, build it as a module for our kernel, and then install it:
[spot@localhost ~]$ git clone https://github.com/umlaeute/v4l2loopback.git
Cloning into 'v4l2loopback'...
remote: Enumerating objects: 65, done.
remote: Counting objects: 100% (65/65), done.
remote: Compressing objects: 100% (40/40), done.
remote: Total 1771 (delta 28), reused 43 (delta 19), pack-reused 1706
Receiving objects: 100% (1771/1771), 811.39 KiB | 8.11 MiB/s, done.
Resolving deltas: 100% (991/991), done.
[spot@localhost ~]$ cd v4l2loopback
[spot@localhost v4l2loopback]$ make
Building v4l2-loopback driver...
make -C /lib/modules/`uname -r`/build M=/home/spot/v4l2loopback modules
make[1]: Entering directory '/usr/src/kernels/5.5.0-0.rc7.git1.2.fc31.x86_64'
  CC [M]  /home/spot/v4l2loopback/v4l2loopback.o
  Building modules, stage 2.
  MODPOST 1 modules
  CC [M]  /home/spot/v4l2loopback/v4l2loopback.mod.o
  LD [M]  /home/spot/v4l2loopback/v4l2loopback.ko
make[1]: Leaving directory '/usr/src/kernels/5.5.0-0.rc7.git1.2.fc31.x86_64'
[spot@localhost v4l2loopback]$ sudo make install
make -C /lib/modules/`uname -r`/build M=/home/spot/v4l2loopback modules_install
make[1]: Entering directory '/usr/src/kernels/5.5.0-0.rc7.git1.2.fc31.x86_64'
  INSTALL /home/spot/v4l2loopback/v4l2loopback.ko
At main.c:160:
- SSL error:02001002:system library:fopen:No such file or directory: crypto/bio/bss_file.c:69
- SSL error:2006D080:BIO routines:BIO_new_file:no such file: crypto/bio/bss_file.c:76
sign-file: certs/signing_key.pem: No such file or directory
  DEPMOD  5.5.0-0.rc7.git1.2.fc31.x86_64
make[1]: Leaving directory '/usr/src/kernels/5.5.0-0.rc7.git1.2.fc31.x86_64'

SUCCESS (if you got 'SSL errors' above, you can safely ignore them)
[spot@localhost v4l2loopback]$ sudo depmod -a

Now, we can load the v4l2loopback module to create a virtual V4L2 video device:
[spot@localhost v4l2loopback]$ sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1

You can change the card label string to whatever you want. This creates /dev/video10 and labels it as OBS Cam.
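One thing to keep in mind: a module loaded with modprobe does not survive a reboot. If you want the virtual camera to come back automatically, the standard modules-load.d/modprobe.d mechanism works; the file names below are my own choice (any *.conf name will do), and the options simply mirror the modprobe line above:

```
# /etc/modules-load.d/v4l2loopback.conf
v4l2loopback

# /etc/modprobe.d/v4l2loopback.conf
options v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1
```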
At this point, I played with pushing content to it via ffmpeg and seeing the result via ffplay, but while fun, this was not what I was going for.
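For the curious, that experiment looked roughly like this (testsrc here is just a stand-in pattern generator; any video file as the -i input works the same way):

```shell
# Terminal 1: push a synthetic test pattern into the loopback device
ffmpeg -f lavfi -i testsrc=size=1280x720:rate=30 -pix_fmt yuv420p -f v4l2 /dev/video10

# Terminal 2: consume the virtual device like any other webcam
ffplay /dev/video10
```

Any application that can read a V4L2 device will see the pattern, which is a handy way to confirm the loopback device is wired up before involving OBS.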

Step 2: obs-v4l2sink

obs-v4l2sink is a plugin for OBS (Open Broadcaster Software) Studio that allows it to write video output to a V4L2 device. In order to build this, you need some more dependencies (and you need to have RPM Fusion enabled):
sudo dnf install qt5-qtbase-devel obs-studio obs-studio-devel

Now, pull the source down from github (https://github.com/CatxFish/obs-v4l2sink):
[spot@localhost ~]$ git clone https://github.com/CatxFish/obs-v4l2sink.git
Cloning into 'obs-v4l2sink'...
remote: Enumerating objects: 94, done.
remote: Total 94 (delta 0), reused 0 (delta 0), pack-reused 94
Unpacking objects: 100% (94/94), 40.07 KiB | 683.00 KiB/s, done.
[spot@localhost ~]$ cd obs-v4l2sink/

Next, I had to hack one of the .cmake files so it would find the OBS cmake files from the RPM Fusion package:
diff --git a/external/FindLibObs.cmake b/external/FindLibObs.cmake
index ab0a3de..7758ee3 100644
--- a/external/FindLibObs.cmake
+++ b/external/FindLibObs.cmake
@@ -95,7 +95,7 @@ if(LIBOBS_FOUND)
-       include(${LIBOBS_INCLUDE_DIR}/../cmake/external/ObsPluginHelpers.cmake)
+       include(/usr/lib64/cmake/LibObs/ObsPluginHelpers.cmake)
        # allows external plugins to easily use/share common dependencies that are often included with libobs (such as FFmpeg)

With that change, now, I could build this from source:
[spot@localhost obs-v4l2sink]$ mkdir build && cd build
[spot@localhost build]$ cmake -DLIBOBS_INCLUDE_DIR="/usr/include/obs" -DCMAKE_INSTALL_PREFIX=/usr ..
-- The C compiler identification is GNU 10.0.1
-- The CXX compiler identification is GNU 10.0.1
-- Check for working C compiler: /usr/lib64/ccache/cc
-- Check for working C compiler: /usr/lib64/ccache/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/lib64/ccache/c++
-- Check for working CXX compiler: /usr/lib64/ccache/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Libobs: /usr/bin/../lib64/libobs.so  
-- Configuring done
-- Generating done
-- Build files have been written to: /home/spot/obs-v4l2sink/build
[spot@localhost build]$ make -j4
Scanning dependencies of target v4l2sink_autogen
[ 20%] Automatic MOC and UIC for target v4l2sink
[ 20%] Built target v4l2sink_autogen
Scanning dependencies of target v4l2sink
[ 40%] Building CXX object CMakeFiles/v4l2sink.dir/src/v4l2sink.cpp.o
[ 60%] Building CXX object CMakeFiles/v4l2sink.dir/v4l2sink_autogen/mocs_compilation.cpp.o
[ 80%] Building CXX object CMakeFiles/v4l2sink.dir/src/v4l2sinkproperties.cpp.o
/home/spot/obs-v4l2sink/src/v4l2sink.cpp: In function ‘bool v4l2device_close(void*)’:
/home/spot/obs-v4l2sink/src/v4l2sink.cpp:217:1: warning: no return statement in function returning non-void [-Wreturn-type]
  217 | }
      | ^
[100%] Linking CXX shared module v4l2sink.so
[100%] Built target v4l2sink
[spot@localhost build]$ sudo make install
[ 20%] Automatic MOC and UIC for target v4l2sink
[ 20%] Built target v4l2sink_autogen
[100%] Built target v4l2sink
Install the project...
-- Install configuration: ""
-- Installing: /usr/lib/obs-plugins/v4l2sink.so
-- Up-to-date: /usr/share/obs/obs-plugins/v4l2sink/locale
-- Installing: /usr/share/obs/obs-plugins/v4l2sink/locale/zh-TW.ini
-- Installing: /usr/share/obs/obs-plugins/v4l2sink/locale/en-US.ini
-- Installing: /usr/share/obs/obs-plugins/v4l2sink/locale/de-DE.ini

If you're paying close attention, you'll notice that it installed the plugin into /usr/lib, and we need it to be in /usr/lib64. Move that file on over.
[spot@localhost build]$ sudo mv /usr/lib/obs-plugins/v4l2sink.so /usr/lib64/obs-plugins/v4l2sink.so

Step 3: OBS
Now, open OBS. You should see an entry under the "Tools" menu for V4L2 Video Output; this means the plugin is loaded.

Set the path to the virtual device we created in Step 1 (/dev/video10).

Click Start. The dialog doesn't go away, but it is running. You can close the dialog.
At this point, you need to add some sources. The first source you should add is a Video Capture source. Click the "+" under Sources, and select Video Capture Device (V4L2). Create new, name it whatever you want, and hit OK. In the Properties dialog that follows, change the Device to your built-in camera. NOT YOUR VIRTUAL DEVICE. Change any of the tunables you need here and hit OK.
Now, you should see the live feed from your webcam in the main OBS window. On my system it didn't take up the full space, so I dragged that box (from the bottom right corner) to fill the space.

You might also want to lock this source (click the lock next to the Video Capture Source), so that you don't accidentally move it around.

Step 4: Effects
If you skip this step entirely, you should be able to get BlueJeans working EXACTLY like it would normally, just with OBS in the middle. But that's not why we're here, so let's add two effects. First, I want to add a Red Hat watermark. I downloaded a PNG of the logo with transparency, then added an additional Source, this time an Image type. I renamed it to Red Hat Logo and hit OK. The Properties dialog prompts me to select the file, I do, then hit OK. You should see it on top of your webcam video feed now (if not, reorder the Sources list so that it is above the Video Capture Device in the list). You can lock the logo's position or make it invisible by clicking the lock or eye next to it in the Sources list. BlueJeans adds a small grey overlay to the bottom of all video windows, so I put the logo on the top of mine, but you can move it around until you have it where you like it.

Now, I wanted to add a fun effect, so I downloaded a short video of a butterfly flying across a black background (video used under license from Freestock.com). Once downloaded, I added a third source, this time a Media Source. I renamed it to Butterfly and hit OK. In the Properties dialog, I selected the file. I also told it to loop and to use hardware decoding when available, then hit OK. You should see the butterfly flying happily in the top left of your feed.

We need to do one more thing: set the black background in the video to be invisible. We do this by adding a Color Key effect filter. First, stretch and move the Butterfly source so that it is where you want it. Then, right click on the Butterfly source and select Filters. Hit the plus under Effect Filters and select Color Key. Now, change the Key Color Type to "Custom Color" and select black (#ff000000). You should now just see the butterfly in the preview window. Make sure you have the Opacity at 100%; we want to see our Video Capture source behind the butterfly! Hit Close and you should see the butterfly flying by!

I set it to loop because I want to be able to turn the butterfly on (and off) as needed, and I can do that by toggling the visibility (clicking the eye next to the Butterfly source). If it was not looping, it would play once and stop, whether or not it is visible.

Step 5: BlueJeans
NOTE: I could not get this to work with Google Chrome/Chromium. Lots of people have posted to the internet trying to get help reliably changing the webcam from the default (first found device) in Chrome/Chromium, without much success. So, I used Firefox. You need to be sure you haven't given BlueJeans blanket permission to use the webcam, or it will keep trying to use your built-in device. Click on the little "toggle" next to the URL and make sure there are no permissions for the camera under "Permissions".

Now, when you open BlueJeans, it will prompt you to give it permission to use the camera and microphone. Be sure to select OBS Cam (or whatever you named your virtual device). You can check the remember box now. I did not, because there are some meetings where the other participants may not appreciate a watermark (or they're all Mac users and none of this works for them). Hit Allow, and you're off! Everything else is BlueJeans as normal.

Step 6: Notes
When BlueJeans shows you your webcam video, it flips it. This is not how anyone else sees it; don't panic or flip it in OBS. On my system, this works in near realtime, and it only uses about 9% of the CPU. I keep the OBS window open on a separate monitor, so that I can trigger effects during the call if I want to.

The butterfly video isn't perfect; the Color Key eats out a bit of the body. A better video (made with transparency/color keying in mind) would solve this issue.

You don't need to hit "Start Streaming" or "Start Recording" in OBS. The plugin we built, installed, configured, and enabled is piping the video from OBS to /dev/video10. If you want to stop, you just go back to Tools->V4L2 Video Output and hit Stop (or close OBS).

Oh, remember when I mentioned a virtual background? Well, you can totally do that with this technique, but you really need a solid background, ideally in a color that never shows up on you or your clothes. In the film business, they use "green screen" (aka chroma key or color key) to accomplish this. I don't have a green screen or even a single color wall in my office, so I couldn't do anything else here, but if you do, you can add a Color Key Effect filter to the Video Capture source (right click it) to remove the "background". Then it will become transparent and you can add an image or video source and layer it appropriately.

If anyone comes up with a clever way to create a virtual background without the need for a "green screen", please share it!

Musings on business models for open source software

Note: I use "open source" here, but I really mean "the amalgamation of open source and free software as defined by the Open Source Definition and the Free Software Foundation, respectively". It was the shortest thing to type.

I have been thinking a lot about open source from a business perspective lately. While the statement that “open source is not a business model” is true, to frame my thoughts I started with categorizing the existing business models for open source software companies.

  1. Consultancies

    These companies build open source solutions for their customers on contract, often around existing open source technologies. Examples of this model are Igalia, Collabora, and Open Source Consulting Group, though other companies often have open source consulting offerings in addition to their primary business model.
  2. Subscription Offerings

    Every piece of code that these companies produce is open source. Usually, these offerings target technologies and solutions for Enterprise use cases (Server OS, Cloud, Middleware, Storage, Virtualization, Containers). Companies here sell customers annual subscriptions which provide support, updates, and access to all current binary releases of the software. Source is available to everyone. Customers do not pay for “right to use”. Examples of this model are Red Hat, WSO2, OpenNMS, Heptio (now VMware), Hortonworks (now Cloudera), Canonical, and Univention.

FOSDEM and Fedora

I have been going to FOSDEM on and off for the last nine years or so, and I have somewhat notoriously described it as 10 pounds of conference in a 5 pound sack. It is the sort of conference where European FOSS contributors and users can be assumed to be present (and lots of other international ones as well). I like FOSDEM, I like that it is community run, I like the wide range of topics it covers, and I even like Brussels (I mean, beer, cheese, and chocolate, how can you go wrong). I do wish they would find a larger venue (my 6'4" body does not fit into most of the seats that the devrooms use), or a longer duration (there are too many devrooms going at once, and too many people who wish to be in those devrooms), but I don't expect these things to change.

This year, I spoke at the CentOS Dojo before FOSDEM, the first time I have done so. I spoke on the topic of using the Developer Toolset to build modern code on CentOS/RHEL, specifically in my case, Chromium. There wasn't a lot to say on it, to be honest, since it can be summarized as "Chromium uses C++14 features not in the CentOS gcc, using Developer Toolset makes it build", but I tried to make it interesting with pictures and details.

I helped out at the Fedora booth during the first day of FOSDEM, though with so many Fedora ambassadors present, there was not a lot for me to do. We had our usual table right by the doors (which meant it was cold). While I thought we did fine, there were a few things that we could have done better.


Fedora at Midwest Rep Rap Fest 2015

I attended Midwest Rep Rap Fest 2015 this weekend, in Goshen, Indiana. Goshen is about 45 minutes outside of South Bend (the nearest regional airport). This part of Indiana is noteworthy for a few reasons, including the fact that Matthew Miller, the Fedora Project Leader, is from there. It also has a very large Amish population, which makes it one of the few places I've attended a conference where most of the local businesses have a place to tie up your horses. The Midwest Rep Rap Fest is an event dedicated to Open Source 3d printers (and their surrounding ecosystem). The primary sponsor of the event is SeeMeCNC, a local vendor that makes open source hardware delta 3d printers. A Delta printer is a 3d printer with a circular stationary bed. Attached to the bed are three vertical rods which serve as tracks for three geared motors. The motors move up and down the rods, and are connected to a central extruder which hangs down the center. The extruder is moved in three dimensions by moving the supports along their tracks. Watching a Delta 3d printer do its thing is pretty amazing; it seems to dance like a trapeze artist as it dips and swoops to print the object.

The Delta type of 3d printer was the most common printer at the event, many people had either bought SeeMeCNC printers or had built their own off their open source design. The SeeMeCNC team brought their super-sized Delta, which they think is the largest Delta printer in the world. It was easily 30 feet tall and barely fit in the building we were using (which is saying something, because we were in an exhibition hall at the local state fairgrounds). The owner of the company decided to see how big of a Delta printer he could build, and this was the result!

The printer used a shop vac to blow plastic pellets up a plastic hose into the giant heated end. Originally, they were trying to print a giant model of Groot (shown in progress in my picture above), but they had to leave it running overnight on Friday and when we came back Saturday morning, the print had failed because it had run out of plastic pellets! Later on, they printed a very large basket/vase with it (after fixing it so that it wouldn't run out of plastic).

Fedora had a table in the main room. I brought two open source 3d printers from Lulzbot and controlled them both from my laptop running Fedora 21. My larger printer, the Taz 4, was configured with a dual extruder addon, and I spent four hours on Friday calibrating it to print properly. On Saturday morning, I printed my first completely successful dual color print, a red and white tree frog!

The eyes didn't come out perfect, but it all came out aligned and in one piece. Several people offered me tips and advice on how to improve the print quality with the dual-extruder setup. One of the nice things about the Rep Rap fest was the extremely friendly nature of the community. Everyone was eager to help everyone else solve problems or improve their printers/prints. I used Pronterface to control the Taz 4, since it was better suited to handle the dual extruder controls.

My smaller printer, the Lulzbot Mini, was controlled with Cura-Lulzbot (a package which got added to Fedora a few days before the show!). Cura has a very fast and high quality slicer, but with fewer options for tweaking it than slic3r (the traditional open source slicing tool) has. 3d printers depend on a slicing tool to take a 3d model and convert it into the GCode machine instructions that tell the printer where to move and when to extrude plastic. Cura also has a more polished UI than Pronterface.

The Lulzbot Mini is able to self level, self clean, and self calibrate, which almost eliminates the prep time before a print! One of the vendors at the show was Taulman, who is constantly innovating new filaments for 3d printing. They announced a new filament the weekend of the Rep Rap Fest, 910, and they gave me a sample to try out on the Mini. The Mini can print filaments with a melting point of 300 degrees Celsius or less, so it was well suited for the 910. 910 was interesting because it was incredibly strong, almost as good as polycarbonate! It was also translucent, which made it ideal for me to finish a project I've been working on for a long time: my 3d printed TARDIS model!

I printed four window panels and a topper piece for the lantern on the roof. A few other people had TARDIS models (including one that had storage drawers inside it), but mine was the biggest (and I think, the nicest).

One of Fedora's neighbors was mUVe, an open source SLA 3d printer. SLA 3d printers use a liquid resin and a DLP projector to make incredibly accurate 3d models that would be difficult or impossible to print on other kinds of 3d printers. It seemed like everyone was printing the same Groot model at the event, and they printed one that came out looking incredible. The inventor of the hardware was working their table, and we talked for a while about the importance of open source in hardware. He felt strongly that it was mandatory for him to release his work into open source so that other people could innovate and improve upon the designs he'd created. The mUVe printer was one of the largest SLA printers I've ever seen and the quality of its prints was amazing. The biggest downside is the complexity: it involves chemicals in the resin and in curing the prints once they have finished, but in my opinion, it was worth it. The cost was in the $1500-2000 price range, but he said he's working on something awesome that will bring that cost down. They used Creation Workshop to slice and control their printer, which was new to me, but it was also open source. It's written in C#, but I want to see if I can get it working in Mono on Fedora. (They were also in the greater Detroit area, so I encouraged them to come out and demo it at Penguicon!)

Another neighbor had 3d printed an amazingly intricate "home clock". They had used a famous woodworking pattern, converted each of the pieces to a 3d model, then printed them. Each piece was then smoothed and attached together. The only piece they didn't print was the clock at the center! On the table, the top of the clock was taller than me (and I'm 6'4"). It didn't look 3d printed; it looked too nice! It took them 3 months to print it all. The owner said that if you're able to cut this model from wood and assemble it properly, you're considered to be a master in the woodworking community. Everyone at the event was definitely in awe of it.

It seemed like everyone showing off something at this event had a clever hack of their own. Some people were creating amazing models, some people had built new open source printers. One printer had color changing LED strips attached underneath it which changed from red to green to indicate the progress of the printing job. Another printer had a Raspberry Pi with camera wired into it so you had a "printer's eye view" as it printed. There was a custom 3d scanner designed to scan people's heads and torsos to make printable busts. There was even a printer that looked like some sort of industrial robot gone mad! The one thing these all had in common? They were open source. No one here was questioning open source, it was just the way they operated, sharing what they knew and building off each other's successes (and failures). There were a few MakerBot Replicators, but all of them had been hacked in some way.

Attendance at this year's event was both up and down. There were more people and companies exhibiting at the event, including Texas Instruments, Hackaday, Lulzbot, Taulman, and Printed Solid. Printed Solid was giving out free samples of some amazing ColorFabb filament. I came home with some BronzeFill (prints into a bronze like material that when polished is heavy and shiny), a new flexible filament, and some carbon-fiber infused filament! They also had some really fantastic glow in the dark filament, but no samples of that were available (and I didn't have the spare cash to buy a full spool). General attendance at the event was about 750 people, which was down from last year (around 1000). The general consensus was that the event wasn't doing all it could to advertise itself, and the location wasn't exactly optimal (45 minutes from the nearest regional airport, almost 2 hours from a major airport). The majority of visitors were local to the Indiana/Michigan area. The event staff said that next year they plan on rebranding the event to a more general FOSS 3d printing event (not limiting themselves to the Midwest region of the US). I think that is the right decision, since they are the only open source 3d printing event that I'm aware of, and I'd really love to see them grow into something bigger and more accessible.

Oh, did I mention we had a celebrity at the event? Ben Heck was there with his Delta printer! He's built a pinball machine. I might want to be him a little bit (but I'm not). He was very friendly and cool, spent a lot of time talking to the other makers and attendees.

Thanks to Ben Williams, Fedora had a very nice booth setup. We had our Fedora tablecloth and lots of stickers to give away. I brought a good sampling of models I'd printed with Fedora and my 3d printers, and I had a lot of good conversations about using Linux and open source to power 3d printing and 3d model creation. My coworker (and celebrity writer) Brian Proffitt stopped by on Saturday and helped out at the table for a while. I was supposed to have Fedora 21 media to hand out, but the promised shipment never arrived. The computers there were a mix of Windows and Linux, very few Macs in this community. Several people were using Fedora, but most of the Linux instances were Debian.

The Fedora event box needs a little love, there wasn't very much in it that was useful anymore. The OLPC in it is very old now, and since the current OLPC hardware runs Android these days, it isn't as "cool" as it used to be. I restocked it with Fedora bubble stickers, but it probably needs a plan to revitalize it.

All in all, it was a very fun weekend event and a great opportunity to connect with the open source 3d printer community. I think it is the responsibility of Fedora (and Red Hat) to reach out to the maker communities and help them be open source in their own ways, and this was an excellent opportunity to do exactly that. Is there a Maker event happening somewhere near you? You can sign up to represent Fedora at that event like I did at MRRF: Fedora Event Calendar

Chromium revisited

It's been more than a year since I've had a successful build of Chromium that I was willing to share with anyone else, but last night I pushed out a Fedora 20 x86_64 build of the current stable Chromium. Here's where you can go and get it:

1) Get this repo file: http://repos.fedorapeople.org/repos/spot/chromium-stable/fedora-chromium-stable.repo and put it in /etc/yum.repos.d/
2) EDIT - I've signed the packages with my personal GPG key, upon request. This means you also need to download my public key. You can either get it from:
or by running:
gpg --recv-key 93054260 ; gpg --export --armor 93054260 > spot.gpg
Then, copy it (as root) to /etc/pki/rpm-gpg/
cp -a spot.gpg /etc/pki/rpm-gpg/
3) Run yum install chromium chromium-v8

Why has it been so long?

A) I really believe in building things from source code. At the very least, I should be able to take the source code and know that I can follow instructions (whether in an elaborate README or in src.rpm) and get a predictable binary that matches it. This is one of the core reasons that I got involved with FOSS oh so many years ago.

B) I'm a bit of a perfectionist. I wanted any Chromium builds to be as fully functional as possible, and not just put something out there with disabled features.

C) Chromium's not my day job. I don't get paid to do it by Red Hat (they don't mind me doing it as long as I also do the work they _do_ pay me for).

D) I also build Chromium using as many system libraries as possible, because I really get annoyed at Google's policy of bundling half the planet. I also pull out and package independently some components that are reasonably independent (webrtc, ffmpegsumo).

If my schedule is reasonably clear, it takes me about 7-10 _days_ to get a Chromium build up and going, with all of its components. Why did it take a year this time around? Here's the specifics:

AA) I changed jobs in November, and that didn't leave me very much time to do anything else. It also didn't leave me very motivated to drill down into Chromium for quite some time. I've gotten my feet back underneath me though, so I made some time to revisit the problem....

BB) ... and the core problem was the toolchains that Chromium requires. Chromium relies upon two independent (sort of) toolchains to build its Native Client (NaCl) and Portable Native Client (PNaCl) support. NaCl is binutils/gcc/newlib (it's also glibc and other things, but Chromium doesn't need those so I don't build them), and PNaCl is binutils/llvm/clang/a whole host of other libs. NaCl was reasonably easy to figure out how to package, even if you do have to do a bootstrap pass through gcc to do it, but for a very long time, I had no success getting PNaCl to build whatsoever. I tried teasing it apart into its components, but while it depends on the NaCl toolchain to be built, it also builds and uses incompatible versions of libraries that conflict with that toolchain. Eventually, I tried just building it from the giant "here is all the PNaCl source in one git checkout" that Google loosely documents, but it never worked (and it kept trying to download pre-built NaCl binaries to build itself, which I didn't want to use).

** deep breath **

After a few months not looking at Chromium or NaCl or PNaCl, I revisited it with fresh eyes and brain. Roland McGrath was very helpful in giving me advice and feedback as to where I was going wrong with my efforts, and I finally managed to produce a working PNaCl package. It's not done the way I'd want it to be (it uses the giant git checkout of all PNaCl sources instead of breaking it out into components), but it is built entirely from source and it uses my NaCl RPMs.

The next hurdle was the build system inside Chromium. The last time I'd done a build, I used gyp to generate Makefiles, because, for all of make's eccentricities, it is the devil we understand. I bet you can guess the next part... Makefile generation no longer works. Someone reported a bug on it, and Google's response can be paraphrased as "people use that? we should disable it." They've moved the build tool over to something called "ninja", which was written by Google for Chromium. It's not the worst tool ever, but it's new, and learning new things takes time.

Make packages, test packages, build, repeat. Namespace off the v8 that Chromium needs into a chromium-v8 package that doesn't conflict with the v8 in Fedora proper that node.js uses. Discover that Google has made changes to namespace zlib (to be fair, it's the same hack Firefox uses), so we have to use their bundled copy. Discover that Google has added code assuming that icu is bundled and no longer works with the system copy. Discover that Google's fork of libprotobuf is not compatible with the system copy, but the API looks identical, so it builds against the system copy but does not work properly (and coredumps when you try to set up sync). Add all the missing files that need to go into the package (there is no "make install" equivalent).

Then, we test. Discover that NaCl/PNaCl sort of works, but nothing graphical does. Figure out that we need to enable "Override software rendering list" in chrome://flags because Intel graphics are blacklisted on Linux (it works fine once I do that, at least on my Thinkpad T440s, your mileage may vary). Test WebRTC (seems to work). Push packages and hope for the best. Wait for the inevitable bugs to roll in.


I didn't do an i686 build (but some of the libraries that are likely to be multilib on an x86_64 system are present in i686 builds as well), because I'm reasonably sure there are not very many chromium users for that arch. If I'm wrong, let me know. I also haven't built for any older targets.


FOSDEM is pretty much _the_ European community FOSS event. I've been going on and off for a few years now, but in the last few years, it has had a dedicated Legal devroom, and I really enjoy that aspect of it. I spoke in a short session in the Legal devroom on H264 and Cisco's donation of openh264. I thought that talk went okay, but every time I give a new presentation, I immediately realize 10-20 ways I could have improved it (even if I never give that talk again). Afterwards, someone from Mozilla came over to argue that the Cisco release of openh264 was a net win for FOSS and Linux distros, and I think we had to agree to disagree on that point. His point eventually boiled down to "we're losing users to Chrome, we desperately need openh264 to compete", which is a bit like me saying "Fedora is losing users to other distros, we desperately need non-free software to compete". Ahem.

Anyways, I was also on a panel about Governance in FOSS communities, which I thought went well, even if most of us on the panel were not entirely sure whether we were qualified to speak on that topic. :) Karen Sandler had some good questions, as did the audience, and it was a packed room.

Not to take away from any packed room, but FOSDEM has really, really outgrown its venue. The Université libre de Bruxelles is nice, and it is free (or mostly free, from what I hear), but 3 out of 4 sessions I'd have liked to see were full before I even had a chance. They need much bigger rooms (or more days with repeat sessions).

I also brought a Lulzbot Taz 3 3D printer with me, but because I'm an idiot (and assumed an auto-switching power supply), I cooked the power supply in the first hour. Later, we thought we had a working power supply replacement, but it was a 110V unit (and the Taz 3 really needs a 230V supply). Thankfully, the Fedorans had brought some RepRap printers, so we had 3D printing the whole time, just not on the Taz 3 so much. Lesson learned. Lulzbot donated that Taz 3 (and a replacement power supply) to hackerspace.be.

I had a lot of good hallway discussions with people (there was a larger-than-normal contingent of US Fedora people around because of devconf.cz, which was a week after FOSDEM, but I opted out this year), and a good sampling of delicious Belgian beer. After FOSDEM, I flew to Prague for two days, to scope out the venues for Flock 2 (Electric Boogaloo).

Changing the GNOME 3.10 lock screen art

I upgraded to the Fedora 20 work-in-progress tree on my laptop recently, and one of the first things I noticed about GNOME 3.10 was the new lock screen. Specifically, this:

My wife walked by, saw it, and asked me "what is that ugly thing on your computer?" I was forced to simply shrug in agreement.

Now, I admit, I do not possess the finest of tastes. My tastes are less caviar and champagne and more bacon and beer, but that's fine for me. I know what I like, and I know that this pink spray is not it. I figured there might be others who don't appreciate the choice of artwork here either, so I took a few moments and figured out how to change it.

If you go into the gnome-control-center (accessible from the top right menu, then clicking the "tools" bubble), then click on "Background", it will open a window where you can change Background and Lock Screen. Clicking on Lock Screen brings up an interface where you can choose from a set of art (either Wallpapers, Pictures, or Colors) that has been hardcoded by the GNOME upstream for your viewing pleasure. Pick one of these pictures and the Pink Panther nightmare is gone.

... but what if you want a custom picture there? I like to customize my desktop. I changed my background to one of the optional F19 backgrounds (the one with the tree frog), and I've been using a custom plymouth plugin for the bootsplash with an animated Hypnotoad for a while now. I found a clever picture of the Futurama splash screen on the internets:

Okay, so maybe that's only funny to me, but I kept thinking that the Pink Volcano reminded me of that. Download a copy of that file (or whatever JPG or PNG you like), and put it into ~/Pictures. Now, it will show up in the gnome-control-center tool when you restart it. Then you can select your custom lock screen.
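If you prefer the command line, the same thing can be done without the control center; in GNOME 3.10 the lock screen image is stored in a gsettings key. The filename here is just an example:

```shell
# Put the image where the Background panel will find it...
cp hypnotoad.png ~/Pictures/

# ...or set the lock screen image directly via gsettings.
gsettings set org.gnome.desktop.screensaver picture-uri \
    "file://$HOME/Pictures/hypnotoad.png"
```

Note the `file://` prefix: the key expects a URI, not a plain path.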

I can't help but wonder why they do not have a file selector here. Seems like the sort of thing that would be nice. Maybe it is coming soon.

EDIT: Documented using ~/Pictures instead of altering the hardcoded wallpaper XML list.

Fedora running UNDERWATER

I've been very very busy lately writing frantically for the Raspberry Pi Hacks book (OMG deadlines everywhere), but I made a little video to go with one of the hacks, and I thought people might be interested in it.

I painted a Raspberry Pi with NeverWet paint and ran it completely submerged in a container of water, with no case. Running Pidora, of course.


I apologize for the video quality. Cell phone was all I had handy.

Updated scribus packages

At the request of Mo Duffy, I've updated my scribus 1.5 packages to the latest SVN trunk code, and built them for Fedora 18 and 19 (the currently stable releases of Fedora).

I tweaked the layout a bit, but if you want to use these packages, it should be as easy as downloading the new repo file here:


Put it into /etc/yum.repos.d/

Then, yum update scribus.
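Spelled out as commands, the whole update is two steps. The repo URL and filename below are placeholders; use the actual link from the post:

```shell
# Drop the repo file into yum's configuration directory.
# <repo-url> is a placeholder for the link above.
sudo curl -o /etc/yum.repos.d/scribus-1.5.repo <repo-url>

# Pull in the updated scribus build from that repo.
sudo yum update scribus
</imports>
```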

I mostly do this as a convenience to Mo, but if you poke me politely, I can probably be coerced into updating them from time to time.

Running todo list

I've been trying to shove everything that needs doing into a running todo list, but I'm sure I'm missing something.

This is your chance to point out anything that you think I need to do, but haven't done (or shown any signs of doing). No need to point out Chromium, I know about that one.