I recently purchased a new Logitech wireless keyboard for my kitchen computer because the Bluetooth keyboard I previously used was driving me nuts, mostly because of the keyboard layout and sometimes because it didn’t want to connect, possibly due to hardware failure or bad design. It also doesn’t have media keys, so I thought it best just to replace it.
I have previously used ltunify with success, though I only chose it because “L” comes before “S”, so it was my first stop. Since I received feedback that I should try Solaar, I did so this time. Since there isn’t an official Linux-based application available from Logitech, the fine open source community has stepped in to make managing your devices simple and straightforward.
Once it is installed, launch it using your preferred method: the menu, KRunner, etc.
Right off the cuff, this is a more user-friendly application with some additional features. For starters, whatever devices you have connected to your Logitech receiver will display a battery status. In the case below, I have a keyboard and mouse already paired with the Unifying receiver.
The device listing has a logical layout, with verbose device information, device options and battery status. What is nice about this application is having the ability to modify the settings of the device. My K400 Plus keyboard has its function keys set up so that, by default, they act as media keys. This is not what I prefer, so I can toggle the “Swap Fx function” option here.
Pairing A New Device
My reason for using this application was to pair my new keyboard with an existing receiver. I don’t see the value in having more than one USB port populated unnecessarily. Pairing a new device is very straightforward: select the root “Unifying Receiver” and select “Pair”. A dialog will pop up and ask you to turn on the device you want to pair.
When you do that, the receiver will grab the new device, associate it and have it available to be used.
That is all there is to it. Each device will have its own set of options that are adjustable per your preferences. This Performance MX Mouse has more options than the value M175 Mouse.
That just about does it for Solaar. There are some other fun features, like getting device details, but I don’t really want to post those here because I don’t really know if that is information I should be sharing!
Having Solaar in the system tray is quite handy. The reality is, I don’t need it all the time, but having it there to manage my devices is great. It’s nice to know that you can manage multiple Unifying receivers with this application. It is easy to use and has a great, well laid out and straightforward interface. I am glad I was recommended this application.
I recently posted about my computer build. In short, this is a computer build on parts that are in no way considered top of the line. They are all quite old and that did pose a few problems. One, this motherboard would not boot from a software RAID pool. I was able to bootstrap the BTRFS RAID pool with a separate drive and root partition. It did add some complexity to my system but I think it works out okay.
Building a system is something I have wanted to do for quite some time, as in, several years, but time, finances and decision vapor-lock had kept me from it. What pushed me over was a fortuitous conversation at a Christmas gathering last year: I struck up a nerdy conversation with a computer store owner that ultimately got me this giant Thermaltake case without a motherboard. A few weeks later came another fortuitous happening: I was given a bunch of old computer equipment, and an AM3 motherboard among the aged equipment drove the rest of the build. My course of action was to stuff the most memory and fastest processor into it, which is what I did, and I am happy with it. I am not going to belabor that process as I have talked about it before, and I have a link you can follow if you are interested in those details.
As a result of this, I had tons of fun, it was a great learning experience, and that same guy gave me another case, not as big but far more robust in design, with a water cooler. I now want to build another machine, but I am thinking a more pure gaming machine, and leave this current machine as my server workstation. I don’t know when I would get to this, but I think this one will be a project I do with my kids: use it as a teaching opportunity and turn it into a kind of family event. Currently, the machine has a Core 2 Duo CPU platform of some kind. I think I would probably do another AMD build, something newer that can take advantage of these new fancy CPUs coming out. I still wouldn’t go bleeding edge but certainly something closer than what I have now.
I have fully evaluated my use of Emby and given a little write up on it. I described the installation process, setting it up, importing my media files and so forth. I want to just summarize the highlights and the lowlights of my experience before I begin my next testing of Plex.
What I like
Emby is super easy to set up. It is nothing more than copying one line into a terminal and executing it. It is super simple and the script also seems to, at least on the version I installed, start and enable the emby-server service.
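As a sketch of what that one-liner flow looks like, the commands below are printed rather than executed, and the installer URL is a placeholder, not Emby's actual address; the real command comes from Emby's download page for your distribution:

```shell
# Placeholder URL for illustration only -- get the real one-liner from
# Emby's download page.
INSTALLER_URL="https://example.com/emby-installer.sh"

install_emby() {
    # Dry-run: print what would be executed rather than running it.
    echo "would run: curl -fsSL $INSTALLER_URL | sudo sh"
    # On my install, the script itself started and enabled the unit;
    # these systemctl calls would just confirm that state afterwards.
    echo "then verify: systemctl is-enabled emby-server && systemctl is-active emby-server"
}

install_emby
```

The verification step is optional, but it is a quick way to confirm the installer really did enable the emby-server service on your system.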
It’s super easy to add media libraries to Emby. The wizard walks you through it in the beginning and if you want to add additional libraries, that is very easy to do through the configuration tool.
Streams to just about everything in the house. Essentially, if it has a browser, you have access to the Emby server. I haven’t had any issues with the system in the approximately five weeks I have been using it.
Updating the metadata and identity of any movies is as easy as a click and search. You can change the cover images and so forth. Some of the movies I have ripped haven’t always been detected completely correctly. For example, there are three different Grinch movies and I had to manually define which decade they came from. It was super easy.
The Android application works quite nicely. I am actually impressed with the ease of use of the application. It also has quite the fine polish to it as well.
What I don’t like
This was an open source project that went closed source. I sort of have an issue with that and I am not alone with that assessment. It was at that point that Jellyfin was forked from Emby which is what makes me incredibly interested in Jellyfin.
I can’t stream to my Wii, though I don’t really blame the project for not supporting a 14 year old game console. There isn’t an app on the Homebrew channel though at the time of writing, I realized that there is a browser on the Wii so perhaps more investigation is needed. I will update this paragraph with any new information I learn as I investigate that possibility.
Updates will have to be done manually. The server does say it needs to be updated and to do so requires the same step as installation. That is really the only clunky part about this whole setup.
Emby is pretty great, regardless of what I do not like about it. It is a great experience. If you are undecided on your media server and have a desire to try the different options, this is a good one. If this were my only option, I could easily get along fine with it. Since I have two others, I will check those out too.
I highly recommend you try out Emby as the shortcomings are nitpick issues. I don’t like that it went closed source but the project, closed or open, is sound. It is a great, well polished, experience.
This is my first media server review. I will have follow up articles to this in the near future. If there are any inaccuracies or areas I need to revisit, please let me know and I will take the time to make updates.
It did take me a quick start tutorial to get going. I do kind of wish there were more instructions on how to do things that weren’t in video form. I like video and all, but it is too slow to go through. I would rather scan down a page and see little clips of how each effect is done on its own. I suppose there is nothing stopping me from doing that.
Kdenlive is easy to just get going with. Once you understand the workflow, dump your videos, music, pictures and such in the “pot-o-media” and you are off to the races.
What I Like
Kdenlive is incredibly stable and reliable. Crashing is incredibly rare; I have spent many hours at a time editing and not once has Kdenlive crashed. In all fairness, it’s been hours of editing because I am not very good at it. I have used and rendered video on both my Dell Latitude E6440 and my “new” AMD FX-9590 system without any glitching or issues. I am impressed by the stability and smooth operation of Kdenlive on openSUSE Tumbleweed.
The user interface of Kdenlive makes sense. The shortcuts, the ease of defining the effects and transition as well as previewing the video makes for an easy and enjoyable video editing experience. Even the scrolling across the timeline or through the tracks, all just makes intuitive sense.
The options for rendering videos, or even just audio, have a straightforward interface that makes it quite clear what is happening when you start that render. Also, when you start the render, you can continue to use Kdenlive. It does not lock you out of the application.
What I Don’t Like
The text editor for title screens is a bit ropey. The cursor indicator isn’t always visible, so I often have to make a special effort to get to the right location, which includes some deleting and retyping from time to time. Using it is not as much fun as the rest of the application.
Not so much a fault of the application, but video editing really needs more screen real estate. One 1080p screen is not enough; it is hard to see and read everything going on without excessive scrolling.
Kdenlive is a great application with a lot more features than I even know how to use. I don’t do any complex video editing. I don’t have good video equipment, so I don’t have a real high level of motivation to create a lot of video content at this time. You can only polish a turd so much, and I am often not happy with the video I shoot. I am happy, however, with what I can do with the video in Kdenlive. It makes turning lackluster video into barely acceptable content possible. Kdenlive is easy to use, and it is enjoyable to turn the mess I start with into something more usable. I would like to find more excuses to do more video content because of the great user experience Kdenlive provides.
I have heard people complain that Kdenlive isn’t stable; well, that is a bunch of hooey. Kdenlive on openSUSE Tumbleweed works fantastically well without any crashing. I am very thankful for the fantastic packaging and QA process of the openSUSE Project, and I am very grateful to every programmer that has had a hand in every piece of this, from the Linux kernel to the Plasma desktop to the application itself. Thank you for all your time and efforts.
Linux Powered Festive Lights
I have moved from Christmastime to wintertime. One of those I like a lot more than the other, but having “winter lights” brightens up the space around me and pushes that gray, cold, dark sadness away. Thanks Linux!
Anything multi-colored and Christmas specific has been taken down. The strands of multicolor lights on my porch have been replaced by blue lights. The wreath and Santa are down, but in Santa’s place is an inflatable snowman. Everything is now white and blue around my house. Not as much fun as Christmastime, but I think there is a rule about how much fun you can have at any point in the year, and I don’t want to over indulge. I have to keep it to the designated times and be seasonally appropriate.
I have purchased a few other little things to add to my display. What can I say, I enjoy talking about it. More on that in the future.
Of the three universal package formats, AppImage has historically been my least favorite due to the more squirrely way of managing each application. Essentially, you had these files scattered about your file system or shoved in a folder someplace, and if you wanted to put them in the menu, you had to do it manually. When you downloaded an update (because not all AppImages support updating), you had to recreate or edit the menu entry. It lacks all sense of intuitiveness; it is just incredibly primitive.
Some AppImages would integrate themselves into your menu system and even perform their own updates. Most of them, however, do not implement those nice little features. There is also another step before launching one: modifying the file properties to make it executable. Not a difficult step, but it was another step that made it feel a little clunky to use. Combine all these anti-features together and it was my least favorite universal package format. Still grateful, just least interested.
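To make the contrast concrete, here is the manual routine sketched out. Everything in it is made up for illustration: the AppImage is a hypothetical download, and the demo writes into a temporary directory rather than a real menu location.

```shell
# Manual AppImage integration, done the old way. SomeApp-1.0.AppImage
# is a stand-in for a real download.
DEMO_DIR="${TMPDIR:-/tmp}/appimage-demo"
mkdir -p "$DEMO_DIR"

# 1. The downloaded AppImage (simulated here with an empty file).
APPIMAGE="$DEMO_DIR/SomeApp-1.0.AppImage"
touch "$APPIMAGE"

# 2. The extra step before first launch: make it executable.
chmod +x "$APPIMAGE"

# 3. Manual menu integration: a .desktop entry you maintain yourself.
#    It would normally live in ~/.local/share/applications, and you
#    would re-edit it after every update.
cat > "$DEMO_DIR/someapp.desktop" <<EOF
[Desktop Entry]
Type=Application
Name=SomeApp
Exec=$APPIMAGE
Categories=Utility;
EOF
```

These are exactly the steps that a tool like AppImageLauncher collapses into a single prompt the first time you run the file.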
Enter AppImageLauncher. This brings a significant change to the universal package landscape. I have been favoring Snaps for many reasons: the central repository, the ease of use of the command line or graphical tools (I used the command line), automatic updates and a vast selection of applications have made it my first stop on the universal package route. The next has been Flatpak. It has a pseudo central repository, nothing official, and it integrates nicely with Plasma’s application explorer, Discover. Flatpak has recently been better about automatic updates and does a fantastic job of theming itself automatically to your desktop settings.
Lastly has been AppImages, because of the rather ad-hoc nature and disjointed desktop experience they have provided. They would respect your desktop themes and are a great non-committal way to try an application, but they lacked a convenient and clean way to access them. I have used AppImageLauncher for only a short period of time, but it is a game changer as far as desktop experience is concerned. The ease of installation and removal of your application in the menu and the automatic organization make for a purposefully integrated experience. You really can’t tell that you are using an AppImage unless you right-click the menu entry. Now, on my openSUSE systems, AppImage is a first-class citizen beside my system package manager (RPMs), Snaps, and Flatpaks. 2020 is starting off great in the software world.
BDLL Follow up
Something that doesn’t seem to get enough attention is the BDLL Discourse Forum. There is a lot of great discussion going on there, not just because I am dumping everything I am working on there but because it is a great place to get help, talk about your Linuxy experiences and just have great conversation about interesting things in tech.
The Linux Half Top was a thread submitted by Steve (Mowest). He had a broken laptop screen, and instead of dumping $100 plus into the machine for a new screen and touch panel, he took the screen off entirely and added an HDMI to VGA adapter. Steve gave credit to another community member, Dalton Durst, for the idea. It reminded Sleepy Eyes Vince of the Commodore 64, where the computer was in the keyboard and just needed a screen.
The whole idea was brilliant, simply brilliant and was an exercise in problem solving by looking for an entirely different solution. Well done.
I highly recommend you take a trip to the BDLL Discourse for some very interesting discussion, discoveries and ideas.
postgresql10 (10.10 -> 10.11) 59 line item changes applied to PostgreSQL
xfce4-terminal (0.8.8 -> 0.8.9.1) Respect the “Working Directory” setting when opening initial window, Fix invalid geometry on Wayland, and several other polishing improvements.
xfce4-branding-openSUSE (4.14+20191207 -> 4.14+20191230) several packages relating to openSUSE branding which included setting the default cursor to Adwaita
libvirt had CVE-2019-11135 addressed
ALSA received several upstream fixes and UCM and UCMv2 fixes and enhancements. See Changes
NetworkManager (1.18.4 -> 1.22.2) Fix multiple issues in the internal DHCP client, including: wrong parsing of search domains and classless routes options, and failures in obtaining and renewing the lease with certain server configurations.
flatpak (1.4.3 -> 1.6.0) several fixes to include fixing some leaks and not to poll for updates in the portal when on a metered connection.
Catfish (1.4.11 -> 1.4.12) for Wayland and GNOME Shell support
Ffmpeg-4 numerous subpackage updates
SSHfs (3.6.0 -> 3.7.0) to give you a higher max connection count to improve responsiveness during large file transfers.
Four more snapshots are in the pipeline with pending stable scores.
Computer History Retrospective
I was recently watching an episode of Computer Chronicles that covered the idea of “Simulator Software” recorded in 1983. They talked of the flight simulators of the time, simulations of architecture and urban design. Even in the 1980s they were saving money by doing virtual testing of an environment before you spend the time and money on the real thing.
There was a flight simulator used by the military in the early 1980s that is, by today’s standards, not so great, but if I were running that on an Amiga or x86-based PC in the mid-90s, it would have been pretty darn impressive.
It is interesting to see how far graphics capabilities have advanced since then. Any one modern graphics card has such incredible graphical capabilities, delivering fantastic realism. It’s something that is pretty amazing if you stop and think about it.
I can’t help but wonder how those ideas were sold at the time: punching information into a computer that, by all accounts, is not all that capable of calculating the vast sets of variables that are handled today. Today, there is so much more that can be done with finite element analysis in software that you don’t have to pay for. Examples of this are FreeCAD and Fusion 360, the first an open source application, the second a closed source application but free to use for hobbyists.
This is a great episode of the Computer Chronicles if you are interested in seeing the early development of computer simulation in the early 80s. The excitement around it is pretty fascinating and we can thank these people for pushing the technology from which we enjoy the fruits today.
Some time ago I started noodling around the idea of building a replacement server for my home. I wanted to make this an extreme budget build. I came to the realization that I had become rather disconnected from the state of desktop-class video cards and really much of anything that was outside of the laptop world. I was hung up, for quite some time, on the case and motherboard selection. I would browse Newegg and eBay, but since I lacked a lot of information, I was in a constant state of decision vapor-lock. What changed was when I received some hardware at no cost. An incredibly large case and an AMD motherboard locked in the portion of the project on which I was unable to make any decisions and dictated the rest of the build. So, over a period of months, I slowly acquired the rest of the needed components.
The case, although in good condition, certainly looks like it was at some point outmoded and just became a place that parts were thrown into. I would guess this case is as old as my Linux journey.
The motherboard that was given to me was an AM3/AM3+ motherboard. I was actually kind of excited about this as I decided I was going to do a complete AMD build. Sure, this is an older AMD CPU socket with a silkscreen date on the board of 2013 but that meant getting something on the cheap was certain. Also, since I don’t exactly buy new things, this fit the bill.
This is what I ended up getting, mostly from eBay, so replicating this selection at these prices may or may not be possible.
Power Supply – RaidMax RX-1000AP-S Power Supply – $74.19
CPU – AMD FX-9590 – $119.95
CPU Cooler – Cooler Master Hyper 212 Plus – $22.59
Memory – 32 GiB DDR3 1866MHz – $64.95
Storage – 6, Seagate 2TB drives – $149.70
6-port SATA Card – $25.35
USB 3 All-in-one 5.25″ Front Panel Card reader – $19.99
Blu-ray DVD player – $50.00
2x 3.5″ to 5.25″ adapter trays – $8.58
Serial DB9 RS232 9pin com port with bracket – $4.14
6x SATA Cables – $9.48
That made a grand total of $638.87 invested in this machine. I went just a bit over budget due to the CPU cooler. I was warned that the TDP rating on the CPU meant it was necessary to have an effective cooler.
This was the first time I have actually assembled from parts and pieces a computer. I have repaired and upgraded many but this was the first of this level of DIY. Since every part I had was untested and I had no way to verify if anything was working, as in, nothing else upon which I could conduct individual component testing, there were a lot of uncertainties in this.
When I kicked it on for the first time and had everything working, I was incredibly relieved that it all worked. There weren’t any issues at all with any of the components.
To see this machine actually start up and work in a kind of cobbled together state was not too far short of a miracle. I was very fortunate that all the used hardware actually worked.
Operating System | openSUSE Tumbleweed
There really wasn’t any other choice. I need long term reliability and I am not interested in reinstalling the operating system. I know, through personal experience, that Tumbleweed works well with server applications, is very tolerant to delayed updates and will just keep chugging away.
I have been very satisfied with the stability of Tumbleweed as a server for the last year on my “temporary system” performing that role. The issues I did have with that system, although minor, have been with video due to the Nvidia GPU. This build, I purposely avoided anything to do with Nvidia due to the dubious support they provide.
This was an area that took me several months of research and reading. My criterion was that I had to have a BTRFS RAID 10 storage array. This afforded me a lot of redundancy but also a lot of flexibility, as it will allow me to slowly upgrade my drives’ capacity as they begin to fail.
When deciding the file system, I did a lot of research into my options. I talked to a lot of people. ZFS lost consideration due to the lack of support in Linux. I am perfectly aware that the development is done primarily within Linux now but it is not part of the mainline kernel and I do not want to risk the module breaking when the kernel updates. So, that was a non-starter.
I looked at a few LVM options, but I wasn’t confident in understanding all the details, and I didn’t want to risk any reliability due to my ignorance. Why I ended up using BTRFS is due to the reliability and flexibility of the file system. Anyone that says RAID 10 on BTRFS is not reliable is sadly mistaken.
Since the motherboard I have wouldn’t recognize a software RAID and boot from it, I used a 7th drive to bootstrap the whole system. That drive also runs BTRFS for the root file system, and I threw in some swap as well.
I used a 6-port SATA card for the 6 drives of the BTRFS RAID array and mounted the array as /home. At some point, I want to take advantage of the subvolume capabilities of BTRFS, but that will come at a later time.
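Roughly, the disk layout looks like this. The device names and UUIDs below are placeholders for illustration, not my actual ones:

```
# One-time array creation across the six data drives (hypothetical names):
#   mkfs.btrfs -m raid10 -d raid10 /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf /dev/sdg
#
# /etc/fstab -- the bootstrap drive carries / and swap; the array is
# referenced by its filesystem UUID (shared by all member drives) and
# mounts the whole RAID 10 pool at /home:
UUID=<root-drive-uuid>   /      btrfs  defaults  0  0
UUID=<raid10-pool-uuid>  /home  btrfs  defaults  0  0
UUID=<swap-uuid>         swap   swap   defaults  0  0
```

A nice property of this arrangement is that any one member device of a multi-device BTRFS filesystem can be named in fstab to mount the whole pool.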
Since this is my new central computer, as it were, I wanted it to have all the faculties for doing the regular nonsense that I conduct in my SuperCubicle. Since it seems I have made a bit of a reputation for doing computer-y things, I tend to help other people out with data recovery, backing up their systems and so forth. I also like to mess with single board computers, and although I can stick an SD card in my laptop, I wanted something with all the media card slots in it and external SATA ports for plugging in drives as well. This case already had some USB and SATA connections on the top. The 5.25″ media dashboard has SD, MS, MMC, XD, TF, M2, CF and SATA interfaces. There is also a power connector port and USB 3. I have used many of these interfaces already. As a bonus, it has a temperature sensor that I attached to the CPU cooler that tells me what the temperature of that monstrosity is. It really hasn’t gotten very hot yet, but I will see how hot I can get it after I really start pushing it.
The optical drive is also getting a regular workout as I have been dipping into the bargain bin of post-Christmas season movies to add to my media collection. All in all, this has been the perfect hardware build for me and my purposes. As it stands today, I only have 3 open bays on this machine, so anything smaller just wouldn’t do.
I didn’t just build this system to look old in my basement. I have had plans for this thing for longer than many of the parts. Its number one task is to be the central repository of all my data, everything from records to movies. To that end, beyond the standard server functions you get by “flipping a couple switches”, like Secure Shell, Samba and Syncthing, I wanted to go further. Something “cool!”
I am currently testing Emby, Plex and Jellyfin. This is probably what this machine does most right now, that and ripping the DVDs and Blu-rays I purchase using MakeMKV (another blathering for another time). This function doesn’t seem to be very taxing on memory or processing power. Maybe if I had more machines drawing media from it, it would, but that is not an issue at this time.
Although I am not exactly doing much gaming, I think I played a game of River City Ransom: Underground with my youngest. I have also played Descent 2 (rebirth) on this machine, and it, of course, ran it extremely fast. At this point, I haven’t come close to taxing the video card. I am planning to do more Linux gaming with it and by that, I mean, anything that I can run in the Linux environment, so Wine and Proton, those are also fair “game”.
Since this is the most capable machine I own, I’m using it to render video. It does the task in a third of the time of my Dell Latitude E6440. Would faster be nicer? Sure, but I don’t exactly churn out lots of video content for it to matter. I still tend to edit the video on my laptop but render it on this machine, mostly because I don’t have great monitors for it yet. That will come later.
I will be implementing a Nextcloud server and start playing around with some note taking applications that I can self-host. Not that I am unsatisfied with Simplenote, I just happen to like to keep my options open.
Another service I want to run is Home Assistant. I have these plans for implementing “smart devices” that are not cloud based going off someplace else. I want to have Home Assistant, manage all my devices and make my home just a bit more convenient. That is also another blathering for another time.
I had originally intended to make a video of this build, to include the installation process, but after reviewing the footage and being bored out of my mind watching it, I have kicked that to the curb and will maybe turn it into a 1980s sitcom montage set to music or something.
Although this computer has only been up and running for about two months, I am slowly adding more services and functions to it. For now, it is pretty light, but in a few short months, that will most certainly start growing. I am very happy with the sub-$700 build for a computer system that has met or exceeded my expectations. It was a fun first complete, from-the-ground-up, scrap-together assembly that really was a gamble. I am pleased with how well openSUSE Tumbleweed runs on it and that I have had no disturbances with any operating system updates.
Often, after a project, you will review it, have an “After Action Review” and ask yourself, “What would I do differently if I were doing this again.” I can honestly say, there is nothing I would change. I like everything about this machine. I would, perhaps, like more storage space as I have already gobbled up 2.5 TiB of my 5.5 TiB of storage space. Reviewing what I spent and the additional cost of the larger storage, I would have still made the same decision. So, back to would I change anything? No, I think I made the right decision. I do have upgrades planned for the future but that is a project for the fall. This machine truly fits my needs, even if much of the hardware is yester-years retired bits.
Fusion 360 is a CAD / CAM application with finite element analysis capabilities. I was going through the Autodesk forums and read a lot of chatter about their position on the Linux client. It appears that for several years, there have been requests but there is no plan to support it.
One user gave a fantastic, well thought out, logical case for building Fusion 360 to work on Linux, and he listed the typical reasons for not doing so along with answers:
The management sees not enough customers here. It’s a question of the cost/income ratio.
I think, if done right, there is not much cost (keyword: continuous integration).
Number of potential customers. Linux users need to raise their hand and write to Autodesk so that they can see there are potential customers. Linux already leads on the server market, and on embedded devices, smart phones and tablets (if you count Android as Linux).
On the desktop, Windows is still the dominating system (88%), Mac (9%), Linux (2%). But this is for the average user, this doesn’t need to be true for engineers and makers using CAD software.
I have no statistic here, but I personally have never seen engineers working on Mac. But I have seen many engineers, software developers and scientists that work on Linux.
Linux users are willing to Beta test and are able to generally figure things out for themselves.
There were a lot of hostile responses from Windows users that were just… hostile. I do think that is a large part of the untold story. There are those that point to Linux and talk of technological elitism, but I don’t think that is a behavior exclusive to Linux users at all. I can refer to this post for evidence otherwise.
Even though Autodesk has stated that they have no plans to support Linux, it is always with the caveat of “at this time.” I still have hope that Linux will be supported in the future. It seems inevitable: there is a larger percentage of Linux users in the engineering field, Autodesk does support Linux for the Maya application, and since there are more and more professional tools on Linux, I truly believe it will follow.
It took me far too long to complete the write up and video, but I must say that the tiling features in Plasma are pretty fantastic. I spent this past weekend doing a lot of administrative work for another job of mine, and the tiling manipulation of windows and desktop navigation made the tasks far less painful than they have been historically. I have to emphasize once again that it is important to have key combinations that make sense, are easy to remember and quickly become intuitive to you.
I made a little video about this with Kdenlive and put it on YouTube. I had a less than stellar comment about my production quality. For that, I can say, I’ll try better next time.
I did a post this last week on my use of Linux in the kitchen. I appreciated a lot of the great feedback I received from this. I don’t want to understate, at all, the value of technology in the kitchen. It is not at all a strange science experiment being shoe-horned into a role in which it doesn’t make sense. Linux and its array of tools make several kitchen tasks more efficient.
For my case, the right hardware was an important part of the implementation, as I have a very limited amount of counter space. There were already several software applications I had been using; I just happened to further expand how I use them.
How it recently made the Christmas season more efficient…
What would improve Linux in the kitchen is going to take some real effort on my part. Most of these things will be aided by single board computers or IoT-like devices. I need more metrics in order to improve my results when baking, improved inventory management and improved meal planning. All but the last one will take some serious work and effort to implement.
BDLL Follow Up
The Fedora 31 challenge. A lot of people were rough on it, and in some ways I understand, but in others I do not. I have used Fedora periodically and I have always found it to be an enjoyable experience. Fedora is a lot more like getting a Lego set with some instructions than it is a ready-made product. I look at Fedora as a more industrial-grade Linux system that you implement for a specific purpose, while the Ubuntu flavors are more like ready-to-use products that focus on the out-of-box experience. All the flavors of Linux have a place and a target audience. Everyone is entitled to their own opinion about a distribution experience, but I think it is a bit unfair to evaluate Fedora the same way you would evaluate an Ubuntu.
I have decided to use Fedora’s Plasma edition and I am going to give it a fair, but biased, review. My expectations are very focused. I don’t need the “last mile” type polish, nor do I expect that from a Fedora or an openSUSE for that matter. What I do expect is something very easy to work with and mold to my wishes.
openSUSE does a great Plasma. I don’t mean out-of-the-box perfect for my needs; no distribution should ever target me as the core user, as that would be tremendously silly. I am an edge case and I am never satisfied. I am a moving target of requirements and expectations for what I want in my personal workspace, and I would be a high-maintenance target for a perfect out-of-box experience.
wiggle (1.1 -> 1.2) a program for applying patches that ‘patch’ cannot apply due to conflicting changes in the original. Wiggle will always apply all changes in the patch to the original. If it cannot find a way to cleanly apply a patch, it inserts it in the original in a manner similar to ‘merge’ and reports an unresolvable conflict.
bubblewrap (0.3.3 -> 0.4.0) The biggest feature in this release is support for joining existing user and PID namespaces, though this doesn’t work in setuid mode at the moment. Other changes: namespace info is stored in the status JSON, PID 1 is now marked dumpable in setuid mode, and it now builds with musl libc.
gthumb (3.8.2 -> 3.8.3)
gnome-shell (3.34.2+0 -> 3.34.2+2): polkitAgent now only sets key focus to the password entry after opening the dialog, and the keyboard no longer accesses a deprecated actor property.
libnl3 (3.4 -> 3.5.0): xfrmi: introduce XFRM interfaces support; xfrm: fix memory corruption (dangling pointer).
mypy (0.720 -> 0.750): more precise error locations, and the daemon is no longer experimental.
python-Sphinx (2.2.2 -> 2.3.1), python-Sphinx-test (2.2.2 -> 2.3.1), python-jedi (0.15.1 -> 0.15.2), python-mysqlclient, python-parso (0.5.1 -> 0.5.2), python-pybind11 (2.4.2 -> 2.4.3), python-typeshed (0.0.1+git.1562136779.4af283e1 -> 0.0.1+git.20191227.21a9e696)
wireshark (3.0.7 -> 3.2.0) bug fixes and updated protocol support as listed
Firefox (70.0.1 -> 71.0): improvements to Lockwise, the integrated password manager; more information about Enhanced Tracking Protection in action; native MP3 decoding on Windows, Linux, and macOS; the configuration page (about:config) reimplemented in HTML; and new kiosk mode functionality, which allows maximum screen space for customer-facing displays. Numerous memory-related CVEs were also addressed.
I think we often take for granted the multimedia capabilities of computers today. It seems like someone is always harping about PulseAudio on Linux. I’d say they are likely not using the right distribution, by which I mean openSUSE; I don’t have these issues. The purpose of this section is not to tout the superiority of my favorite operating system’s audio subsystem, but rather to talk and reflect about how great we have it today with all things audio on modern computers.
In 1983, the state of digital music was not as rich as it is today. We can now enjoy a virtually endless supply of content never before available in human history. Let’s go back in time to an era when the Commodore 64 was the pinnacle of home computer audio, where audio was entirely programmed and limited to four waveforms: sawtooth, triangle, pulse and noise. It offered a multi-mode filter featuring low-pass, high-pass and band-pass outputs, attack / decay / sustain / release (ADSR) envelope controls for each of its three audio oscillators, and a few other things I barely understand. Regardless, the capabilities were limited, and synthesizing voice was an incredible undertaking that took years of work long after the chip was in the wild. This was one of the first polyphonic sound chips on the consumer market and, to this day, it is held in high regard, and many still like the sounds it produces.
All that said, this was a very interesting record of computer-generated music that is certainly worth a listen. I find the experimentation and musical education tools used in this period incredibly fascinating. Today, things are very different. Musical composers and artists use computers in music production, and to do otherwise would likely be considered insane. I now wonder if the individuals in the 80s who pushed the art and science of computers in music were considered insane by their peers.
Post Christmas Day shopping yielded me a really nice find, specifically something pretty fantastic from Lowe’s that allows me to fix my AC light strands. A Holiday Living Light Tester. The directions could have been a bit more clear… maybe worth a video… but I was able to recover three of my LED bush nets. Since they retail for about $10 each, that has made the purchase worth it already. This device is supposed to work with LED as well as incandescent lights. I’ve only tested it on LED thus far and it works well.
This is a device that I wish I had discovered long ago.
Christmas Lights Sequence to Music with xLights
Very comprehensive software that allows you to look at the waveforms, change playback speed, and more easily adjust the actions to occur at the right time. I’ve only begun to scratch the surface of its power and capability, and the reality is, I don’t know what I don’t know about using this software. My setup is really quite simple, therefore I can’t take full advantage of its capabilities.
Some of my favorite effects to date are the butterfly, marquee, fireworks, life and fan. They currently give me the visual excitement I am looking to put into the sequences.
There are many more effects to discover, but due to the limited nature of my display as it currently stands, I can’t do some of the fancier items, yet.
I recorded two videos and posted them to YouTube. They are nothing terribly special, but I am quite pleased with how they turned out.
Funny aside, I went to record the second sequence and there was a car parked in front of my house, waiting to watch it.
I did decide to employ an FM transmitter so that people can listen to the music in their vehicles, but I don’t actually have a sign announcing that fact.
The old boy on the block that is well known. I haven’t used or tried it yet, but this is still the one I hear the most about. Because it is the popular one, I tend to go for other things… for reasons unknown.
This will be the next one I try. I have noticed that they have a Docker image, so I am going to take this as an opportunity to learn some things about Docker while I’m at it. The key feature of this one is that it is completely open source, which has great appeal to me.
This is the media server with which I started this journey and am currently testing. I had planned to test the others already, but I have been engaged in other matters. It has decent name recognition but did go closed source after gaining some momentum. I have been using it for a few weeks, and the feature I like is that it works much as you would expect Netflix to. If you activate notifications, you’ll be notified about a “new release” when you put something in your repository of media. I thought that was kind of cute. Setting it up is pretty trivial, and I will be doing a write-up on it as well.
I want to review each of these media servers on my openSUSE Tumbleweed based workstation / server and see how it goes. Really, there is enough horsepower that I can have all three running and see how each of them plays out, as it were.
Restoring my Nexus 6P To Working Order
As a kind of Christmas gift to myself, I spent the 5th day of Christmas disassembling this phone and installing a new battery. I shelved the project in August but didn’t put it out of sight. Seeing it almost daily, it has been gnawing at me to get it done, and I finally did it.
I bought a battery replacement kit on eBay for this phone that had most of the tools I needed. I had no interest in doing a tear-down video, as there are plenty of those on YouTube, such as this YouTube video demonstrating battery replacement of the Nexus 6P. Although the repair of the device was rather annoying and tedious, you know, just difficult enough to scare off smarter people than me, the part that took me the longest was updating the phone and installing LineageOS with everything working.
There was only one real issue: working cell service. The problem ended up being that there was a security lockout preventing the SIM from being accessed, and disabling it is what ended up fixing it.
As we wrapped up the year in BDLL challenges, our task for this week was to make some predictions about the year 2020. They didn’t have to be Linux related, exactly, but since Linux and tech are the focus of the show, it only made sense to keep it as such.
What I am wishing for, in 2020, is commercial grade CAD / CAM, manufacturing technology software to come to Linux, not necessarily for home use but for use in business.
Specifically, what I would like to see is Fusion 360 by Autodesk supported at some level on Linux. It already runs well on Linux through Lutris, but having actual support for it would be fantastic. I would also like to see PTC’s Creo running on Linux. PTC once supported Linux with earlier offerings of their mechanical design package but no longer does so today. It would be great to see.
Aside from bug fixes and the removal of unneeded dependencies, here are some of the highlights of the last six snapshots:
Remmina, an RDP client, updated to version 1.3.7, which included improvements to translations, a better authentication MessagePanel API, printer sharing improvements, and various bug fixes.
NetworkManager, updated to 1.8.25+20. The applet now scales icons for HiDPI displays.
BlueZ, the Bluetooth stack, received a version update to 5.52, which fixed AVDTP session disconnect timeout handling, disabled one more segfaulting patch, and fixed numerous other issues.
KDE Plasma updated to 5.17.4. Discover’s fwupd backend will no longer whine when there is unsupported hardware, and there were improvements to KWayland integration and numerous other fixes and improvements.
GNOME Desktop was updated to 3.34.2, which has undoubtedly further improved the experience for its users.
GTK3 updated to 3.24.13+0
GStreamer Plugins, updated to 1.16.2, fixed numerous issues in the V4L2 video codecs.
Wireshark updated to 3.0.7, which addressed CVE-2019-19553, a CMS dissector crash.
Akonadi has been updated to 19.12.0. There weren’t any features added, but improvements and bug fixes were implemented.
WireGuard updated to version 0.0.20191219, which added support for nft, prefers it, and fixed various other issues.
YaST updated to 4.2.47, with bug fixes and refinements to how it operates.
php7 updated to 7.4.0, where systemd restrictions for FPM were relaxed, along with various other improvements.
The Tumbleweed Snapshot Reviewer gives 20191210 a stable 99; 20191211 a stable 99; 20191213 a stable 91; 20191214 a moderate 90; 20191216 a stable 96; and 20191221 a stable 98.
This is a new segment I am going to try out for a few episodes to see how it fits. Since I am a vintage tech enthusiast, not an expert, I like looking back and seeing the interesting parallels between the beginning of the home computer, or micro-computer, revolution and now.
The Computer Chronicles is a program that spanned 20 seasons, starting in 1983. The first episode with original hosts Stewart Cheifet and Gary Kildall focused on mainframes to minis to micro-computers, and it was such a fascinating discussion. Cheifet asks Kildall, right off the bat, whether he thinks we are at the end of the line in the evolution of computer hardware or if there are major new phases of this evolutionary process ahead.
Gary responds with a “no,” saying that computers are getting smaller, faster and less expensive. He speculated that they would get so small you would lose them like your keys.
I couldn’t help but think: if Gary were still alive today, how many times would he have lost his cell phone, and would he think back to those words? I know that I have lost my cell phone in my house, the one I just fixed, three times.
Watching the demonstration of the TX-0, the first transistor-powered computer, was quite fascinating.
The supercomputers of the 1960s filled entire rooms while engineers experimented with parallel processing. In the 1970s, computers miniaturized to something resembling a single server rack; these were called minis and were considered portable because they were on wheels. In the late 70s and into the 80s, micro-computers came into prominence and, although substantially cheaper than the mainframes and minis, were still far more expensive than what can be picked up today.
I found this particular episode very interesting due to the excitement over how small computers were getting, even though by today’s standards they were really quite large. The hunger for speed was just as apparent in 1983 as it is today in 2019… almost 2020.
The micro-computer they demonstrate here is a Hewlett-Packard HP-150, which was an attempt at being user friendly with a touch screen interface, though nothing like the touch screens of today, as it uses infrared transmitters. It is noteworthy that in the demonstration of the machine by Cyril Yansouni, the General Manager of the PC Group at HP, it was stated that the most intuitive tool for interacting with a computer is your finger. That holds true today, looking at how people interact with tablets and mobile devices. The interaction seemed rather clunky by today’s standards, but I think it is pretty cool to see the innovation of the time. Yansouni also stated that he doesn’t think touch alone is the most ideal interface; he thinks some combination of touch, keyboard, mouse and even voice will be something more ideal. I think he was correct on this. This machine, the HP-150, has a kind of goofy look about it but is, at the same time, pretty cool as well. I’m really glad it was demonstrated.
The direction being discussed here was the future of computer technology. Herb Lechner stated that the future would be networking computers together through local area networks so data can be shared. Gary Kildall and Cyril Yansouni speculated, very excitedly, that data communication over the phone system would be the future of networking because local networks were too expensive and difficult to set up. I wonder what they would say about this today.
What I really learned from this particular episode is that, one, our desire for smaller, faster, better computers hasn’t changed. There was experimentation on the form and function of computers with what the best technology of the time had to offer, and there was lots of fragmentation, far more than anything we have today. I also learned that most of the experts tend to be wrong about the future of technology; that hasn’t changed today either.
2020 is on the horizon, and to quote my favorite fictional character of all time, Doc Brown, “the future is whatever you make it, so make it a good one.” Make 2020 the best year you can, be kind to one another and should things not go as you planned, don’t hold any resentment against yourself or those around you.
As a kind of Christmas gift to myself, I spent the 5th day of Christmas disassembling this “shelved” phone of mine and installing a new battery. It is something I have wanted to do since the battery started fading, and I finally got to it.
I bought a battery replacement kit on eBay for this phone that had most of the tools I needed. I am not going to provide you a tear-down video; there are plenty of those on YouTube, and if you are interested in that, click here. That will tell you everything you need to know and possibly more. I am going to focus more on the areas of difficulty and the installation of LineageOS.
The Pixel was an okay phone, but it was a bit too small for my hand, I didn’t like how it fit in the phone holder in my truck, the battery didn’t end up being much better after about 6 months of use, and I couldn’t put LineageOS on it because it is locked down.
The video gives you a list of tools to use to do the repair. I didn’t have everything, exactly as they suggested. I grabbed my whole kit of tools available and this is what I ended up using:
Plastic triangle opening tool of two different thicknesses
Tweezers. I used whatever tweezers I had in my tool box, which ultimately came from the bathroom medicine cabinet. I would recommend a better set, but something is better than nothing.
Box cutter. I didn’t have a precision knife set as recommended in the video. I would highly recommend something like that, and I won’t do another repair without it. The box cutter worked, but that is a little like using a sledgehammer when all you need is a 16 oz claw hammer. Sure, it gets the job done, but it makes a bit of a mess of your project surface.
Paper clip in place of a SIM card ejection tool.
Heat gun. Mine was probably overkill but it worked fine.
Small Cross-recessed (Phillips) screw driver. The battery kit came with screw drivers but I prefer my nicer, more professional set. Even I can show up to a party in the right outfit from time to time…
The video recommends playing cards, but those were chewed up pretty quickly on me, so I had to use some more rigid cardboard to slide between the battery and the body of the phone. Your mileage may vary. In my case, this “Hello Fresh” junk mail bit worked better than a playing card. Basically, use anything rigid that is not so stiff as to crease the battery and cause it to vent with flame.
I used a spudger, though not a fancy black nylon one; this one was able to pry and get between the frame and the screen well enough.
Double sided sticky tape to put the lower back panel back on
I also used a dental pick to help with picking at the device. I recommend something like this for many of your smaller projects, especially if you have giant sausage fingers.
1 hour of time to devote to the project
The two areas requiring extra care in this project are removing the glass around the camera and removing the battery.
The glass needed to be loosened up with the heat gun, gently, so as not to overheat the device, which can cause irreparable damage. Once I got this portion heated up enough, the glue started to release the glass plate enough to allow me to get that knife in there. This would have been easier with smaller, more precise tools. Thankfully, I didn’t break it.
Once the glass and the plastic cover are removed, that will expose the 6 screws holding the device together.
The spudging tool is required to carefully separate the body of the phone from the screen assembly. I was “fortunate” that this phone has a bit of buckling around the volume button, which made it a bit easier to get the case away from the screen assembly.
The phone comes apart and exposes all the little secrets of its design. It also exposed the fact that this thing is incredibly dirty and needed a good cleaning with some isopropyl alcohol.
The other area of concern is the battery. It is imperative that you take extra care not to bend the lithium-polymer battery too much, or it will “vent with flame,” which can be a rather spectacular event, one that I was not interested in having. There is a small amount of clearance that will allow you to start prying at the battery. I carefully used the heat gun to loosen the glue here as well. Once I got the battery up a little, I used several card-like things to pry it up.
The rest of the instructions in the video were spot on, but the emphasis on the glass and the battery was a bit understated, from my view.
Reassembly of the device was pretty straightforward. Install the battery; I reused the adhesive pads from the previous battery. Then carefully install the cables, making sure they are seated well. The glass still had enough adhesive on it to stay in place, but the bottom plastic bit needed some double-sided tape to keep it in place.
Since this phone wasn’t exactly a “looker” when I started, I am not concerned about how it looks when complete. It is also in a case that will help to hold things in place.
Upgrading the Operating System
This was actually a lot more time consuming than fixing the battery, I am sorry to say. When I started the phone I was greeted with this error about a vendor mismatch.
I have seen this error before so it wasn’t a big deal in fixing this. I downloaded the Google image and flashed the vendor image as per the instructions I found here. In short, here is the process I went through:
Download the nightly ROM and the Gapps (mini Gapps) and put them on the phone
fastboot flash radio radio-angler-angler-03.78.img
fastboot reboot bootloader
Flash the Lineage ROM and the mini Gapps
Wipe the cache and reboot
I admit these instructions are not as verbose as I normally give. If you have any issues, please leave a comment or email me and I will take the time to make it more verbose.
After this process, the phone would not recognize the SIM for cell service. I tried flashing the radio and vendor image and still, nothing. I used this little trick from here, which also didn’t help; it only told me that it didn’t know the IMEI.
That trick is to type *#*#4636#*#* on the dial pad. It exposes some very interesting bits of information about your phone.
I reinstalled using these instructions several times. Service mode didn’t provide me any solutions, and I feared that I had somehow erased the very definition of the phone’s cell radio identification.
As a kind of last-ditch effort, I installed the stock Android image and the cell phone signal miraculously worked again. Installing LineageOS once again left me with no access to the radio. After some more web crawling, the solution sort of came from this Reddit post, which said the issue has something to do with the system lock not releasing the cellular radio to the system.
Sure enough, after disabling all the security features and rebooting, cell service works once again. The specific issue is with the “Secure start-up” feature, which requires a PIN before the system starts. There is some kind of bug in this that is causing issues; where exactly, I have no idea.
To implement this solution and disable the “Secure start-up” feature so the SIM is not locked out, go to Settings > Security & Privacy > Screen lock. I prefer a PIN lock screen, and when you enter your desired PIN, you are asked for your “Secure start-up” preference. Say “No,” reboot the device, and cell service will work normally.
I have installed all the important applications and I am back to full mobile capacity… which… is a pretty short list, really.
Phones today are, frankly, terribly designed. The process to replace the battery is unnecessarily tedious. At this point, I would consider any phone without a user accessible battery a terrible design and I will not purchase another phone that locks away a battery. That signals a design with planned obsolescence. All that does is encourage greater levels of e-waste. I have great hope in the up and coming PinePhone that may not have the performance capabilities of a modern “flag-ship” phone but no matter how much it may lack in processing power, storage, or RAM, it does have a replaceable battery. That means it won’t be a turd of a design that you get from the likes of Apple, Samsung or Huawei.
LineageOS is now a must for a good Android experience. I tried to go several months on Google-locked Android and, frankly, that is not a good experience. The lockdown of applications on the phone is terrible; I should be able to remove whatever applications I want. I have reaffirmed that I will not purchase another locked mobile device. Newer does not mean better, and stock Android is vastly inferior to LineageOS Android. It’s not even a fair comparison, given the significant user improvements the Lineage team puts into Android.
Ultimately, I look forward to the PinePhone. To have an unlocked, user serviceable device that may be a bit less capable on raw performance is a welcome upgrade to just about any mobile phone out there. Give me a headphone jack and access to my battery! I am now done with these mobile nightmare devices.
I am not one to turn up my nose to old technology and I typically am excited about anything a little bit older or vintage to explore. In fact, I am generally excited to take a screwdriver to just about any piece of technology out there. I will say, there has been a recent exception.
I was brought a computer to extract some pictures and such off of it and put them on a flash drive. It is a Pentium 4 Compaq, which means it is a 32-bit machine, and I am sure it hasn’t been turned on in a long time, I am guessing 6 years or greater. I do remember setting this computer up years ago with openSUSE Linux, but I didn’t have the root password for it. There was also some sort of file system error that fsck wouldn’t correct, and without root access, that made things problematic as well. If you are thinking it was a Btrfs problem, you are thinking wrong; it was XFS that had the issue, as this was before openSUSE started using Btrfs on root.
I took the side panel off of the machine to get the drive out but, try as I might, I was not able to remove the drive from the inside. There are fasteners in the side of the drive that are not accessible and ride in a kind of track.
So I decided I would take it out of the front of the machine. After some prodding and probing, I was able to get the face of this derelict machine off and finally remove the thing. The 3.5″ PATA (IDE) drive sits right below the 3.5″ floppy drive. Removal of the drive was now trivial: the plastic retainers just had to be pressed on the side of the drive enclosure, and the drive slid neatly out of the front of the machine.
I had to dig into my storage bin of hard drive related components, and I pulled out an IDE-to-USB adapter. The first one didn’t work, nor did the second; the last one I pulled out was actually able to read the IDE drive, and I don’t have any idea why this was a problem. I have used these adapters for years recovering data from old drives; however, the last time I did such a thing was 2012.
Pulling the contents of the data from the drive took an incredibly long time, much longer than I expected. Transferring 74.1 GiB of data over a PATA interface with a maximum theoretical speed of 133 MB/s really demonstrated how spoiled I have become with SATA drives and SSDs. I walked away and worked on other things due to my lack of patience so the actual time it took is unknown to me. I suppose I could do the calculations…
Using this site here, Calctool.org, it tells me that it could have taken no less than 70.8 minutes. That is probably about right.
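The back-of-envelope math is easy enough to check in a shell. A minimal sketch, where the 133 MB/s figure is the UDMA/133 interface ceiling and the ~19 MB/s sustained rate is purely my assumption for a drive of that vintage, not a measurement:

```shell
# Minutes to move 74.1 GiB at a given rate in MB/s (decimal megabytes).
transfer_minutes() {
  awk -v rate="$1" 'BEGIN { printf "%.1f\n", 74.1 * 1073741824 / (rate * 1e6) / 60 }'
}
transfer_minutes 133   # interface ceiling: about 10 minutes
transfer_minutes 19    # assumed sustained rate: roughly 70 minutes
```

At the theoretical ceiling the copy would have taken only about ten minutes, so the roughly 70-minute figure lines up with what an aging PATA drive can actually sustain rather than what the interface advertises.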
After I transferred all the data locally, I exported the pictures and such to three USB flash drives to be used on whatever computer they wish. The question remains, what do I do with this machine? I could put something 32-bit on there just to see how it would work but the question is, which one? The top contenders for me are openSUSE, MX Linux, BunsenLabs and PuppyLinux (some variant). Maybe I’ll let one of my kids do it as a learning exercise.
I can seldom resist the urge to play with technology; it is a weakness. Basically, as long as the request isn’t “can you install a non-Linux operating system on it,” I am all about it. Recovering data can be a fun project, although, admittedly, this was less fun than other machines due to the obstacles in removing the hard drive.
I find it remarkable how fast the years of tech seem to fly by. It seems like only yesterday that PATA (IDE) was the standard on everything, and I didn’t have any complaints about disk speed back then. Now, using that fifteen-plus-year-old drive just to pull the data off was so much slower than what I remember, or maybe I am becoming less tolerant of waiting for my technology. Either way, as much as I like vintage tech, I do appreciate many of the new standards, like SATA, because not only is it faster, it also has a more robust connector… and the more I look at it, the more I see how it resembles the edge connectors of old.
It is also worth noting that the transfer speed a PATA drive actually sustains in practice is slower than what many people now have as an internet connection speed. Something to think about.
I purchased a middle-of-the-road Bluedio headset that I have been using in both Bluetooth and wired modes. It’s pretty decent and it fits my head well. Unfortunately, my Magilla Gorilla handling of it snapped the headband. I didn’t think I was being rough with it, but I do have a track record of such things. The break was on the left side near the slide-out adjustment and, although the set was still wearable, one ear cushion was loose enough that it would slap the side of my head at every turn.
I had three choices: buy new headphones; deal with it and get used to the gentle paddling of my left ear; or fix it and see if I could return it to an acceptable, usable condition. The paddling was completely unacceptable to live with, and the headphones would no longer fit snugly to my head, so that option was ruled out. The option to buy something new was also out; my budget had already been allocated, and I am not interested in getting new hardware when these were still electronically functional. Why wouldn’t I at least attempt a repair?
The padded headband was well stitched together in such a way that the stitching was easy to delicately remove. This exposed the polycarbonate (I am assuming) structure beneath.
Looking at it, the fix wouldn’t be difficult at all with the right combination of adhesive chemicals: Loctite 444 ethyl cyanoacrylate liquid adhesive along with Loctite SF 7452 cure-speed accelerator for the aforementioned adhesive.
The nice thing about the accelerator (a trick I learned at work) is that you can add adhesive and immediately follow it with the accelerator to layer on material and, consequently, greatly increase the strength. This was a technique I demonstrated when I fixed my broken Porter-Cable drill some time ago.
Just a few minutes of gluing and applying the accelerator extended the life of this headphone set. Would a normal person upgrade to something new and better? Probably, but that is just not how I roll. I can’t bring myself to toss out something that is easily repaired. I have yet to sew the padded headband back together, but I am no longer getting paddled by the ear pad, and when I do handstands, they don’t fall off of my head.
This is not an advertisement for Henkel, but fixing toys or equipment is easily accessible to just about anyone as long as you have these two chemicals. They open up a whole new world of fixing possibilities. I have seen YouTube content creators struggle with gluing broken bits together, clamping them for hours at a time, when the job can be done in a fraction of the time. Sure, these are not the cheapest of products, but they are extremely effective and drastically reduce the likelihood of your project ending in frustration.
Recently, my Linksys E2000 decided it would no longer be the wireless access point I expected it to be and had to be replaced. Updating it, and even reverting it to the original firmware, did not solve the problem; it would just not allow any clients to access the network. No matter what I did, there was no way I could get this thing to work properly, so it was time to replace it. After doing some reading and digging, but ultimately taking the advice of my e-friend Mauro, I purchased an Aruba IAP-105.
The WRT54GL I pulled out of storage just wasn’t cutting it, throughput-wise, even though Wireless G was pretty great some 14 years ago.
This is a nice little device and it feels like a well built unit. The look and feel of this well crafted equipment shouts “professional” or perhaps, “I was built to survive a knuckle-dragging handler like you.”
Resetting the router is done by inserting a paperclip into the recessed hole while the unit is off, then turning it on. Wait about 5 seconds for the LED indicators to flash and you are off to the races. Note that just pressing and holding the reset button while it is on does nothing.
The Access Point presented a login screen and I was unable to find the default username and password anywhere in the instruction manual. It took a bit of digging but I was able to determine that the default username is admin and the password is also admin. I was sure to change that default, as it has been shown far too often that default credentials are left in place and a network gets compromised.
Setting up the Access Point was so simple, and required so little effort on my part, that I was convinced it wasn’t set up properly until I started to see the clients connect. It was amazingly easy.
Under the Network section, select New to enter a new WLAN. What is interesting here is that you have three options: Employee, Voice and Guest. None of these exactly fits my use, but home use is probably closer to “Employee” than “Guest”.
Next was the Client IP &amp; VLAN Settings. In my case, I have no VLANs on my network. Maybe I should, but at this time I don’t see a need. For my purposes, I want the client IP assignment handled by my main DHCP server, and since I don’t have a Virtual Controller, I went with the “Network Assigned” option as it seemed the most reasonable. The client VLAN assignment was left at “Default”.
The Security section was straightforward, and there was nothing to do in the Access section.
Once I completed it, I was a bit confused because I hadn’t set the DHCP server, DNS or anything else. I wasn’t sure if I had missed something, so I clicked around for a while, only to discover that it had taken care of all of that for me.
The client info provided by the access point is very interesting: graphs of the signal strength, connection speed and throughput of each connected device. Now, should I have issues with a client, I can look at the graphs and gain a better understanding of what the issue may be. It could also help me choose a better location for the IAP in the future.
I do want to add a note that I am getting a warning that I only have 100 Mbit/s link speed on the Ethernet port. I am thinking this has something to do with the PoE I am using, as my switch and everything connected to it is full 1 Gbit/s. A bit irritating, but I will circle back on that eventually.
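For what it’s worth, the negotiated link speed can also be checked from the Linux side of any wired hop in the path. The interface name below is an assumption, so list yours first:

```shell
# See which network interfaces the kernel knows about
ls /sys/class/net

# eth0 is an assumed name; substitute your own interface
IFACE=eth0

# Negotiated link speed in Mbit/s as reported by the kernel
# (ethtool "$IFACE" shows the same, plus duplex and advertised modes)
cat "/sys/class/net/$IFACE/speed"
```

Checking each hop this way can help narrow down whether a PoE injector or a cable is the piece dragging the link down to 100 Mbit/s.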
Once again, my network feels solid and strong. I am very happy with this purchase, and buying it on eBay for about $20 made it all that much better. The set up was far more simple than I expected and I am strongly considering getting another one so that I have access points on opposite ends of the house.
I am incredibly satisfied with this little unit. The network connection in my house is very strong and although I am slightly annoyed by the Ethernet speed, it’s probably my fault somehow and I am going to work that out later.
I hate to say this but Linux software is not perfect. I know, I know, but nothing could possibly be wrong with openSUSE, right? Well, Linux and all the open source tools are created by people, and since we are flawed, so are our creations. Sometimes things slip through the quality assurance process at openSUSE and, however rare, they do happen.
One of the problems that has reared its ugly head is an issue with the wifi driver: sometimes, for whatever reason, it cannot authenticate. Another situation is passing a device to a virtual machine and having it not come back quite right.
In short, if you have a device on the PCI bus that needs to be removed and added again, there are some ways to do that. To get the PCI device ID, run:
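The listing command itself appears to have dropped out of this copy; presumably it is lspci from pciutils, which would look something like this:

```shell
# List every device on the PCI bus; the address in the first column
# (e.g. 02:00.0) maps to /sys/bus/pci/devices/0000:02:00.0
lspci

# Narrow the output down to the troublesome device, e.g. a wifi card
lspci | grep -i network
```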
Take note of whatever your troublesome device is from here.
echo "1" > /sys/bus/pci/devices/$NUMBER/reset
This should reset the device and have it behave, but as you may know from your experience with the original Nintendo Entertainment System, sometimes it just isn’t good enough.
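The heavier hammer is to remove the device from the bus and rescan. A minimal sketch of that approach (run as root, with $NUMBER again being the device address noted earlier):

```shell
# Drop the device from the PCI bus entirely
echo "1" > /sys/bus/pci/devices/$NUMBER/remove

# Brief pause so the removal settles before the rescan
sleep 2

# Rescan the bus; the device is re-discovered and re-initialized
echo "1" > /sys/bus/pci/rescan
```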
The sleep 2 is only necessary if you are copying and pasting into the terminal or creating a script; it is just a pause before the PCI bus is rescanned. That is how I used it: I created a script that I can invoke whenever I have problems.
Software isn’t perfect; I have historically had issues on more than one distribution with PCI devices requiring a reset. This method works with openSUSE Tumbleweed in the year 2019. If this should change, I will update this post.