I have historically made my hardware decisions based on price; generally, I get what I can for as little as possible. Basically, I go for free or near-free and fabri-cobble something together. After seeing some other computer setups, I have really thought that I want to be able to function more effectively and efficiently than I had been. One of the areas where I have been less than happy has been my monitor layout. I have been pushing 3 displays with my Dell Latitude E6440 and, for the most part, it has been meeting my needs, but there were some workflows that have not been working out so well.
What I had before was a kind of ad-hoc solution. I started with one monitor, then wanted more screen real estate, so I placed a second one off to one side because that is just what made sense at the time.
What I had here was a laptop screen with 1920×1080 (FHD) resolution, a monitor directly above with a resolution of 1440×900 (WXGA+), and off to the top right a screen with a resolution of 1280×1024 (SXGA). Both of those monitors I purchased for $10 each from a company upgrading everything. I was pretty happy, as going from one monitor to a second was fantastic and adding a third made it even better.
The problem I ran into was that the monitor above was not Full HD, and sometimes that made for usability issues with certain applications. That was compounded by the monitor to the right being only slightly taller physically but quite a bit taller pixel-wise, which made things weird when moving from monitor to monitor.
The solution presented to me by my e-friend Mauro Gaspari is ultimately what I started to pursue when he sent me a picture of his screen setup on Telegram. What he had (and probably still has) is a 1440p monitor. I had never seen such a thing; it was so clean and made so much sense, especially with the ability to tile windows. So began my search and measuring to see what was feasible. Fast forward about eight months: I purchased the LG 29WK50S-P, a 2560×1080, 29″ display with a 60Hz refresh rate.
Initially I wanted to go with a 3440×1440 (UWQHD) screen but I couldn’t get one at the size and price I wanted. Since I don’t have a whole lot of space, and given the distance it would be from my face, anything bigger than 29″ diagonal would take up too much room. I also didn’t want to spend a whole lot, so what I paid was $179.10 for this monitor, and I am quite happy with the price. Sure, more than the $10 I spent on the last monitor, but a heck of a lot more pixels.
The description of this monitor is a 29 Inch Class 21:9 UltraWide® Full HD IPS LED Monitor with AMD FreeSync. It has the following features:
AMD FreeSync™ Technology
Dynamic Action Sync
Smart Energy Saving
Screen Split to give you different picture choices with the monitor.
None of these features were all that important to me. What I was most concerned about was the resolution and the VESA mount. The split screen feature, which I mostly don’t care about, is intriguing as I could use the second display input and do some testing on other distributions with another computer.
I really wasn’t asking for much in a monitor. I am not going to take advantage of the AMD FreeSync at this time, but it is nice to know it’s there.
I have been spoiled by openSUSE Linux for years and years. I haven’t really had to fiddle with anything to get my computer to use hardware, so I expected this ultra-wide monitor to be just as un-fiddly, but it wasn’t. For whatever reason, the computer didn’t recognize the display’s proper resolution.
I don’t know if it is because it falls under the “other” resolution category or if there is some other issue. I am running Tumbleweed, so I do have the latest drivers, and since this monitor has been around for a while, I wasn’t expecting any issues.
The Plasma Display Settings didn’t give me the option of 2560×1080 at all. A quick DuckDuckGo search brought me to the solution to my troubles on the openSUSE forum. I started out by using some “old school” xrandr commands.
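The sequence, assuming your monitor is on the same HDMI-3 output mine was (run `xrandr` alone to find yours), looks roughly like this; the modeline numbers come from `cvt` and will vary slightly by system, so paste in your own output rather than mine:

```shell
# Ask cvt for a modeline for 2560x1080 at 60 Hz:
cvt 2560 1080 60

# Register that modeline with the X server (the numbers shown are
# what cvt produced for me; substitute your own cvt output):
xrandr --newmode "2560x1080_60.00"  230.00  2560 2720 2992 3424  1080 1083 1093 1120 -hsync +vsync

# Attach the new mode to the monitor's output:
xrandr --addmode HDMI-3 2560x1080_60.00
```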
Then I sent the command to change the mode of the screen:
xrandr --output HDMI-3 --mode 2560x1080_60.00
This worked, but it is not a permanent solution, as the next time I rebooted I would lose these settings. That made it time to write an Xorg configuration file for this monitor. Thankfully, it is just one simple text document.
Using the handy dandy terminal, once again, I navigated to the appropriate folder
Then instead of creating a standard type of file that could be overwritten like “50-monitor.conf“, I created a custom one for this particular monitor.
sudo nano 49-LG29WK50S.conf
There is not much in this configuration file, just the modeline and preferred mode along with the Identifier of HDMI-3:
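My file ended up looking roughly like this (a sketch from memory; the modeline itself is the one cvt generated for my system, so regenerate your own rather than copying these numbers):

```
Section "Monitor"
    Identifier "HDMI-3"
    Modeline "2560x1080_60.00"  230.00  2560 2720 2992 3424  1080 1083 1093 1120 -hsync +vsync
    Option "PreferredMode" "2560x1080_60.00"
EndSection
```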
This allowed the Plasma Display module to have the proper mode available in the drop-down and for me to arrange the screens properly.
And now doing something like video editing feels a lot cleaner; the extra width makes this task so much nicer to accomplish.
It’s not a perfect setup, but it is a more perfect setup than what I had. What is nice is that I can very easily tile windows and jump between applications without playing the “where did I go” game.
I don’t know if I have any games yet that take advantage of the ultra-wide screen layout but from a productivity standpoint, this is fantastic.
I have been using it as the monitor on which I do CAD, and I like the wider display much better as the side menus are never in the way of the model itself. Also, the extended design history is almost entirely visible, even on larger models.
Although the DPI is not the same between the laptop and the ultra-wide, I am happy with it. I don’t know if I would want this monitor smaller or if maybe it is time to go up to a 15″ laptop screen. That would make the DPI closer to the same between the laptop and the monitor. I am happy with it after one day of usage, and over time I am sure I will find irritations with the setup.
I want to note that I didn’t go for the curved screen. I don’t think I am quite ready for such a “radical” idea of having a screen curved towards me. Would it have been better? Maybe, I can’t really say and maybe the next screen I purchase will be curved so that I can compare. The way I see it, going from 16:9 resolution to 64:27 (21:9) was enough of a jump. Adding another bit of unfamiliarity of a curve in the display might have just thrown me off (insert smile emoji).
I have more “testing” to do with the monitor but for the $179.10 I spent on it, I think it was worth it. The contrast is nice, the brightness is nice, everything is very pleasing. This might very well be one of the best technology purchases I have made. I much prefer this to the ad-hoc, fabri-cobbled setup I previously had.
It happened again: unfettered access to the Internet has yielded yet another romp down the bunny trail to places I don’t have time to go. I previously discovered the Ultimate 64, a modern replacement board for the Commodore 64; now I have come upon another amazing bit of tech, a modern replacement keyboard for the Commodore 64: the MechBoard64.
Every day, when I walk back to my “healing bench,” the place I fix my kids’ toys or things I break around the house, I see my extra, empty bread-bin box Commodore 64 shell. It has been sitting empty since sometime in the early 90s, and my mind will wander to a place where that would be a functional computer once again. Not that I need another Commodore 64, but I often think I would like a modern re-implementation of one, specifically with that Ultimate 64. When I play games or do IRC on the Commodore 64, I am periodically reminded that old hardware can have some unwelcome hiccups, and why we moved beyond the 8-bit era: glitching out, occasional crashing after hours of usage, incomplete drive compatibility with the SD2IEC device and so forth. I would like the best of both worlds, 8-bit fun and charm along with the modern conveniences of storage and reliability. Is that too much to ask?
I am not a huge proponent of the whole microswitch / mechanical keyboard craze; however, in the case of a Commodore 64, I very much am interested. The MechBoard64 has microswitches from Gateron fitted on a black PCB and mounted in laser-cut aluminum brackets. The aesthetic quality of the craftsmanship of that aluminum bracket with those microswitches affixed to the top has me gazing at it, very much desiring to possess such a carefully engineered labor of love.
I must admit, it is a beautiful looking piece of hardware and everything about it makes me want to build a re-implemented Commodore 64 to enjoy the reliability of the new hardware along with the charm and fun of the 8-bit era.
I have priced none of this out, but I intend on taking that old case of mine, an Ultimate 64 and a new MechBoard64 keyboard to have a premium Commodore 64 experience.
I would be short a user port, but I am willing to bet I can do something more with those USB ports. I am going to pause and daydream of the possibilities…
Since life has yielded me some extra time to think and do things, the time is right to go forth and make my childhood and adult fantasies coalesce into something I can use day-in and day-out. Nothing against the original hardware, but the reliability concerns have made it less fun to share with the next generation of technical enthusiasts. I realize it is not exactly the same, but using 37-year-old hardware regularly does put it at risk of a fatal end. I can’t help but conclude that I should protect it and keep it from the harmful effects of its usage.
This is going to be a fun project; not yet, but soon. A re-implemented Commodore 64 with modern technology sounds like a winning proposition for those retro-computing compulsions. New, yet not; familiar, yet not. How fun!
I love my retro tech. Old computers are just great things. I moved to the Amiga after the Commodore 64 in the early 1990s. I stumbled upon this site and now I want to turn to my aging Amiga 1200 and black it out.
So, why would I want to do this to my Amiga 1200? Well, my old case is yellowing and so are the keys, and I have never really liked that biscuit and gray look. When I saw the Amiga CDTV with its black keyboard and case, I thought how cool and sleek it looked, but I wanted a more traditional computer (at that time), not something meant to go on your Hi-Fi stack. Today, you can have both the cool black look and the full-fledged Amiga computer.
What a great time to be an Amiga 1200 owner! Things I only dreamed of some 25 years ago are now available today.
There are a lot of colors from which to choose, many more options than just black, but I am not sure why anyone would want any other color than black, the color computers are supposed to be. Regardless of these color variations falling far outside of my preference, they do look pretty cool.
At the time of writing, your options are the original white (called “Escom”), black, light grey, grey, pink, light blue, dark blue, orange, rubine red, violet, purple and translucent.
I would personally go with black or dark grey for the cool factor. Outside of wanting a different color of case, if you were anything like me, your computer was opened up with some frequency, and being an uncoordinated young teen, your case may be damaged from insertion and removal of the screws, or from breaking the plastic clips along the back by opening it up in the non-recommended fashion. These new cases have brass screw inserts in all 6 screw towers, so unless you ‘Magilla Gorilla’ those fasteners, you are not likely to have issues. Also, it’s all screws that hold the case together; no more clips to break. I understand why Commodore used clips: screws cost money and add complexity to the manufacturing, and they were always looking to pinch a penny.
The bottom trapdoor, instead of being a slab of plastic, now offers better cooling with extra vents. The rear trap door (which I never used) has 3 options: the plain door, like you would have had on your original A1200; a VGA hole for VGA out; and a DVI hole. No HDMI, but I’m good with that; I’m sure it can easily be remedied.
If you don’t have an actual Amiga 1200 motherboard to put in this, that is NOT a problem, as this supports more than just the original board. Smartly, you can use a Raspberry Pi, MiST FPGA, Keyrah V2 keyboard adapter, RapidRoad DoubleUSB and Lotharek HxC Floppy emulator. So essentially pair this with an original keyboard, or a new mechanical keyboard designed to fit this case, and you are off to the races!
Since my keyboard is in good shape (I haven’t ever spilled anything on it or eaten Cheetos while playing games as a young teen), I am looking to replace the keycaps. It should also be noted that the new mechanical keyboard isn’t ready for purchase yet. The look of these keys is great; black and dark grey looks absolutely fantastic. My keyboard has the UK layout, so this would fit perfectly.
I am pretty excited about having access to new things for old hardware. What an exciting time to be into retro hardware. I hope this is a successful venture. In order to buy a case, you will have to go to one of the partner sites.
This is a great time to be a nerd into 1990s or earlier tech. I have explored a lot of the Commodore 64 side of things and I think it’s time to play in the Amiga sphere a bit now.
I am of the opinion that if you plan to have a desktop computer, and by that I mean a machine without a built-in battery, you need to have a UPS, or Uninterruptible Power Supply. I am also of the belief that you should go as large as you can reasonably afford. Should you have a power interruption, your computer and equipment will thank you in the best way they know how: by not turning off unexpectedly and corrupting your data.
I received this used APC Smart-UPS 1500 a few years back. It did work when I got it, but not for long, as the batteries failed. After building my workstation / server / desktop unit that I make do things all the time, I decided, after a power outage, that it was time to investigate the failure and fix it. I was 99% sure that the batteries were dead, as it was the “Replace Battery” light that was on.
As in any repair of mine, I find that disassembling it first is the best way to go about it. I have had countless other projects go wrong because I trusted online documentation, and batteries for a UPS are never cheap. The first objective is to open it up. The only tool required for this was a crossed-recess (Phillips) screwdriver. I opted for the battery-operated driver because I am lazy.
After removing the batteries, I was completely certain that they were dead as the multi-meter made that very clear.
Search for Batteries
My initial search for batteries led me to realize that this was not going to be a cheap repair. My local supplier of batteries had them listed at $54.00 each. Others I found were all comparably expensive.
Then I stumbled upon a site called BatterySharks.com that had these batteries for sale at $48.00… for a pair. For a pair! I double-checked the specs of the old batteries against the new batteries and completed the order.
I can’t guarantee that the prices will stay so low, but it was certainly a fantastic deal. Shipping wasn’t too bad, another $24.59, which brought the grand total to $72.55. I was thinking, however, that I was making a $72 gamble that there wasn’t anything else wrong with this UPS.
It didn’t take long for the batteries to arrive; the rather weighty package landing with a thud on my front porch signaled as much. The actual installation was really quite simple, just a reversal of disassembly: installing the terminal connections, reinstalling the protective terminal caps, a little double-sided sticky tape to hold the batteries together like the originals (thinking about it, totally unnecessary), and screwing it back together. Extremely basic.
I did clean up the corrosion with white vinegar just to be sure that there wouldn’t be any issues from the old battery acid that leaked.
After assembling it, it was time to do the initial “smoke test” to make sure that I didn’t mess anything up. Sure enough, I turned it on and an incredibly uneventful yet thrilling beep followed, meaning all was well and ready to be used. I did want to do some testing, though.
This included hooking up a laptop to see that when grid power was removed from the UPS, it would continue kicking out uninterrupted power. Upon removal of the power, a clunk with a 60 Hz hum, coupled with an alarming beep, signaled the loss of power, and sure enough, just like its name, the power to my computer was uninterrupted.
I let it sit a while so I could watch the battery charge meter climb while it remained plugged into the mains. Since it all seemingly worked well, I shut down my server, router/firewall, access point and switch to plug them into the UPS. The load indicator was fluctuating between 1 and 3 bars out of 5 while I stood there monitoring it for a while. That was good news, as it is well within the limits of this newly repaired but well-aged device.
Within a week of installing this newly finished UPS system, the power went out at my house. The server, and network equipment kept chugging along and the battery charge held surprisingly well. Since I was using my laptop, I could still access all things on the server, wirelessly, though I was unsure as to how long it would hold out. After about 40 minutes or so, I thought I should probably start shutting things down nicely. I checked the display and I still had plenty of battery to go so I left it and within 20 minutes of that, the power was restored.
The timing of this repair couldn’t have been better…
This was one of those projects that was well worth the time and effort. I do know that I can connect this UPS to a computer and have it do things, but I am not really sure what. I think I need to start playing with the power awareness features so I can figure out how to safely shut down my server and firewall should power levels get low.
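For an APC unit like this one, apcupsd is the usual tool for that job. A minimal sketch of its configuration, assuming a USB data cable to the UPS (the thresholds here are illustrative, not recommendations):

```
# /etc/apcupsd/apcupsd.conf (excerpt)
UPSCABLE usb
UPSTYPE usb
DEVICE

# Begin a clean shutdown when either threshold is crossed:
BATTERYLEVEL 20   # battery percentage remaining
MINUTES 10        # estimated runtime remaining
```

With the daemon running, `apcaccess` reports the live status (charge, load, runtime) from the command line.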
Buying a new UPS can be quite expensive. Repairing a used one is much more affordable and also, a better choice for reducing e-waste. Hopefully, this little writeup and crap-tastic video will give someone just enough courage to try it out themselves.
I recently purchased a new Logitech wireless keyboard for my kitchen computer because the Bluetooth keyboard I previously used was driving me nuts, mostly because of the keyboard layout and sometimes because it didn’t want to connect, possibly due to hardware failure or bad design. It also doesn’t have media keys, so I thought it best just to replace it.
I have previously used ltunify with success, though really only because “L” comes before “S”, so it was my first stop. Since I received feedback that I should try Solaar, I did so this time. There isn’t an official Linux application available from Logitech, so the fine open source community has stepped in to make managing your devices simple and straightforward.
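On openSUSE, installing it should be a one-liner (the package name here is my assumption from the repositories; it may differ on other distributions):

```shell
sudo zypper install Solaar
```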
Once it is installed, launch it using your preferred method, the menu, Krunner, etc.
Right off the cuff, this is a more user-friendly application with some additional features. For starters, whatever devices you have connected to your Logitech receiver will display a battery status. In the case below, I have a keyboard and mouse already paired with the Unifying receiver.
The device listing has a logical layout, with verbose device information, device options and battery status. What is nice about this application is having the ability to modify the settings of the device. My K400 Plus keyboard has the function keys and media keys set up such that, by default, they are media keys. This is not what I prefer, so I can swap the Fx function here.
Pairing A New Device
My reason for using this application was to pair my new keyboard with an existing receiver; I don’t see the value in having more than one USB port populated unnecessarily. Pairing a new device is very straightforward: select the root “Unifying Receiver” and select “Pair”. A dialog will pop up and ask you to turn on the device you want to pair.
When you do that, the receiver will grab the new device, associate it and have it available to be used.
That is all there is to it. Each device will have its own set of options that are adjustable per your preferences. This Performance MX mouse has more options than the value M175 mouse.
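Solaar also ships a command-line interface that mirrors most of what the GUI does; a quick sketch (the subcommand descriptions are my reading of its help output, so treat them as approximate):

```shell
solaar show     # list receivers, paired devices and battery status
solaar pair     # put the receiver in pairing mode for a new device
solaar config   # inspect or change per-device settings (e.g. fn-swap)
```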
That just about does it for Solaar. There are some other fun features, like getting device details, but I don’t really want to post those here because I don’t know if that is information I should be sharing!
Having Solaar in the system tray is quite handy. The reality is, I don’t need it all the time, but having it there to manage devices is convenient. It’s nice to know that you can manage multiple Unifying receivers with this application. It is easy to use and has a great, well-laid-out and straightforward interface. I am glad I was recommended this application.
I recently posted about my computer build. In short, this is a computer built on parts that are in no way considered top of the line. They are all quite old, and that did pose a few problems. For one, this motherboard would not boot from a software RAID pool; I was able to bootstrap the BTRFS RAID pool with a separate drive and root partition. It did add some complexity to my system, but I think it works out okay.
Building a system is something I have wanted to do for quite some time, as in several years, but time, finances and decision vapor-lock had kept me from it. What pushed me over was a fortuitous conversation at a Christmas gathering last year: I struck up a nerdy conversation with a computer store owner who ultimately gave me this giant Thermaltake case without a motherboard. A few weeks later came another fortuitous happening, where I was given a bunch of old computer equipment, and the AM3 motherboard among the aged equipment drove the rest of the build. My course of action was to stuff in the most memory and the fastest processor it would take, which is what I did, and I am happy with it. I am not going to belabor that process, as I have talked about it before and have a link you can follow if you are interested in those details.
As a result of this, I had tons of fun, it was a great learning experience, and that same guy gave me another case, not as big but far more robust in design, with a water cooler. I now want to build another machine, but I am thinking a more pure gaming machine, and leave the current machine as my server workstation. I don’t know when I would get to this, but I think it will be a project I do with my kids: use it as a teaching opportunity and turn it into a kind of family event. Currently, the machine has a Core 2 Duo CPU platform of some kind. I think I would probably do another AMD build, something newer that can take advantage of these fancy new CPUs coming out. I still wouldn’t go bleeding edge, but certainly something closer than what I have now.
I have fully evaluated my use of Emby and given a little write up on it. I described the installation process, setting it up, importing my media files and so forth. I want to just summarize the highlights and the lowlights of my experience before I begin my next testing of Plex.
What I like
Emby is super easy to set up. It is nothing more than copying one line into a terminal and executing it, and the script also seems to, at least on the version I installed, start and enable the emby-server service.
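I won’t reproduce the installer line here (get the current one from emby.media), but once it has run, checking that the service really is enabled and running is straightforward on a systemd distribution:

```shell
# Confirm the installer enabled and started the service:
systemctl status emby-server

# If it didn't, this would do it:
sudo systemctl enable --now emby-server
```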
It’s super easy to add media libraries to Emby. The wizard walks you through it in the beginning and if you want to add additional libraries, that is very easy to do through the configuration tool.
Streams to just about everything in the house. Essentially, if it has a browser, you have access to the Emby server. I haven’t had any issues with the system in the approximately five weeks I have been using it.
Updating the metadata and identity of any movie is as easy as a click and a search. You can change the cover images and so forth. Some of the movies I have ripped haven’t always been detected correctly; for example, there are three different Grinch movies, and I had to manually define which decade mine came from. It was super easy.
The Android application works quite nicely. I am actually impressed with the ease of use of the application. It also has quite the fine polish to it as well.
What I don’t like
This was an open source project that went closed source. I have an issue with that, and I am not alone in that assessment. It was at that point that Jellyfin was forked from Emby, which is what makes me incredibly interested in Jellyfin.
I can’t stream to my Wii, though I don’t really blame the project for not supporting a 14-year-old game console; there isn’t an app on the Homebrew Channel. At the time of writing, I realized that there is a browser on the Wii, so perhaps more investigation is needed. I will update this paragraph with any new information as I investigate that possibility.
Updates have to be done manually. The server does say when it needs to be updated, and doing so requires the same step as installation. That is really the only clunky part about this whole setup.
Emby is pretty great, regardless of what I do not like about it. It is a great experience. If you are undecided on your media server and have a desire to try the different options, this is a good one. If this were my only option, I could easily get along fine with it. Since I have two others, I will check those out too.
I highly recommend you try out Emby, as the shortcomings are nitpick issues. I don’t like that it went closed source, but the project, closed or open, is sound. It is a great, well-polished experience.
This is my first media server review. I will have follow up articles to this in the near future. If there are any inaccuracies or areas I need to revisit, please let me know and I will take the time to make updates.
It did take me a quick-start tutorial to get going with Kdenlive. I do kind of wish there were more instructions that weren’t in video form. I like video and all, but it is too slow to go through; I would rather scan down a page and see little clips of how each effect is done on its own. I suppose there is nothing stopping me from making that.
Kdenlive is easy to just get going with. Once you understand the workflow, dump your videos, music, pictures and such into the “pot-o-media” and you are off to the races.
What I Like
Kdenlive is incredibly stable and reliable; crashing is incredibly rare. I have spent many hours at a time editing, and not once has Kdenlive crashed. In all fairness, it’s been hours of editing because I am not very good at it. I have used and rendered video on both my Dell Latitude E6440 and my “new” AMD FX-9590 system without any glitching or issues. I am impressed by the stability and smooth operation of Kdenlive on openSUSE Tumbleweed.
The user interface of Kdenlive makes sense. The shortcuts, the ease of defining the effects and transition as well as previewing the video makes for an easy and enjoyable video editing experience. Even the scrolling across the timeline or through the tracks, all just makes intuitive sense.
The options for rendering videos, or even just audio, have a straightforward interface that makes it quite clear what is happening when you start that render. Also, when you start the render, you can continue to use Kdenlive; it does not lock you out of the application.
What I Don’t Like
The text editor for title screens is a bit ropey. The cursor indicator isn’t always visible, so I often have to make a special effort to get to the right location, which includes some deleting and retyping from time to time. Using it is not as much fun as the rest of the application.
Not so much a fault of the application, but video editing really needs more screen real estate. One 1080p screen is not enough; it is hard to see and read everything going on without excessive scrolling.
Kdenlive is a great application with far more features than I know how to use. I don’t do any complex video editing, and I don’t have good video equipment, so I don’t have a high level of motivation to create a lot of video content at this time. You can only polish a turd so much, and I am often not happy with the video I shoot. I am happy, however, with what I can do with that video in Kdenlive; it makes turning lack-luster video into barely acceptable content easy, and even enjoyable. I would like to find more excuses to do video content because of the great user experience Kdenlive provides.
I have heard people complain that Kdenlive isn’t stable; well, that is a bunch of hooey. Kdenlive on openSUSE Tumbleweed works fantastically well without any crashing. I am very thankful for the fantastic packaging and QA process of the openSUSE Project, and I am grateful to every programmer that has had a hand in every piece of this, from the Linux kernel to the Plasma desktop to the application itself. Thank you for all your time and efforts.
Linux Powered Festive Lights
We have moved from Christmastime to Wintertime. One of those I like a lot more than the other, but having “winter lights” brightens up the space around me and pushes that gray, cold, dark sadness away. Thanks, Linux!
Anything multi-colored and Christmas-specific has been taken down. The strands of multicolor lights on my porch have been replaced by blue lights. The wreath and Santa are down, but in Santa’s place is an inflatable snowman. Everything is now white and blue around my house. Not as much fun as Christmastime, but I think there is a rule about how much fun you can have at any point in the year, and I don’t want to over-indulge; I have to keep it for the designated times and be seasonally appropriate.
I have purchased a few other little things to add to my display. What can I say, I enjoy talking about it. More on that in the future.
AppImage is one of the three universal package formats. Historically, it has been my least favorite due to the squirrely way of managing each application. Essentially, you had these files scattered about your file system or shoved in a folder someplace, and if you wanted to put them in the menu, you had to do it manually. When you downloaded an update, because not all AppImages support updating, you had to recreate or edit the menu entry. It lacks all sense of intuitiveness; it is just incredibly primitive.
Some AppImages would integrate themselves into your menu system and even perform their own updates. Most of them, however, do not implement those nice little features. There was also another step before launching: having to modify the file’s properties to make it executable. Not a difficult step, but another one that made it feel a little clunky to use. Combine all these anti-features together, and it was my least favorite universal package. Still grateful, just least interested.
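For reference, that manual routine amounts to this (the filename is hypothetical):

```shell
chmod +x ~/Downloads/SomeApp.AppImage   # mark the download as executable
~/Downloads/SomeApp.AppImage            # then launch it directly
```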
Enter AppImageLauncher. This brings a significant change to the universal package landscape. I have been favoring Snaps for many reasons: the central repository, the ease of use of the command line or graphical tools (I use the command line), automatic updates, and a vast selection of applications have made it my first stop on the universal package route. The next has been Flatpak. It has a pseudo central repository, nothing official, and it integrates nicely with Plasma’s application explorer, Discover. Flatpak has recently gotten better about automatic updates and does a fantastic job of theming itself automatically to your desktop settings.
Lastly have been AppImages, because of the rather ad-hoc nature and disjointed desktop experience they have provided. They would respect your desktop themes and are a great non-committal way to try an application but lacked a convenient and clean way to access them. I have used AppImageLauncher for only a short period of time but it is a game changer as far as desktop experience is concerned. The ease of installing and removing an application in the menu and the automatic organization make for a purposefully integrated experience. You really can’t tell that you are using an AppImage unless you right-click on the menu entry. Now, on my openSUSE systems, AppImage is a first-class citizen beside my system package manager (RPMs), Snaps, and Flatpaks. 2020 is starting off great in the software world.
So why would you use the AUR?
BDLL Follow up
Something that doesn’t seem to get enough attention is the BDLL Discourse Forum. There is a lot of great discussion going on there, not just because I am dumping everything I am working on there but because it is a great place to get help, talk about your Linuxy experiences and just have great conversation about interesting things in tech.
The Linux Half Top was a thread submitted by Steve (Mowest). He had a broken laptop screen, and instead of dumping $100-plus into the machine for a new screen and touch panel, he took the screen off entirely and added an HDMI to VGA adapter for an external display. Steve gave credit to another community member, Dalton Durst, for the idea. It reminded Sleepy Eyes Vince of the Commodore 64, where the computer was in the keyboard and just needed a screen.
The whole idea was brilliant, simply brilliant and was an exercise in problem solving by looking for an entirely different solution. Well done.
I highly recommend you take a trip to the BDLL Discourse for some very interesting discussion, discoveries and ideas.
postgresql10 (10.10 -> 10.11) 59 line item changes applied to PostgreSQL
xfce4-terminal (0.8.8 -> 0.8.9.1) Respect the “Working Directory” setting when opening initial window, Fix invalid geometry on Wayland, and several other polishing improvements.
xfce4-branding-openSUSE (4.14+20191207 -> 4.14+20191230) several packages relating to openSUSE branding which included setting the default cursor to Adwaita
libvirt had CVE-2019-11135 addressed
ALSA received several upstream fixes along with UCM and UCMv2 fixes and enhancements. See Changes
NetworkManager (1.18.4 -> 1.22.2) Fix multiple issues in the internal DHCP client, including: wrong parsing of search domains and classless routes options, and failures in obtaining and renewing the lease with certain server configurations.
flatpak (1.4.3 -> 1.6.0) several fixes to include fixing some leaks and not to poll for updates in the portal when on a metered connection.
Catfish (1.4.11 -> 1.4.12) for Wayland and GNOME Shell support
Ffmpeg-4 numerous subpackage updates
SSHfs (3.6.0 -> 3.7.0) to give you a higher maximum number of connections to improve responsiveness during large file transfers.
Four more snapshots are in the pipeline with pending stable scores.
Computer History Retrospective
I was recently watching an episode of Computer Chronicles that covered the idea of “Simulator Software” recorded in 1983. They talked of the flight simulators of the time, simulations of architecture and urban design. Even in the 1980s they were saving money by doing virtual testing of an environment before you spend the time and money on the real thing.
There was a flight simulator used by the military in the early 1980s that, by today’s standards, is not so great, but if I were running that on an Amiga or x86-based PC in the mid-90s, it would have been pretty darn impressive.
It is interesting to see how far graphics capabilities have advanced. Any one modern graphics card has such incredible capability, delivering fantastic realism. It’s something that is pretty amazing if you stop and think about it.
I can’t help but wonder how those ideas were sold at the time, punching information into a computer that by all accounts was not all that capable of calculating the vast sets of variables that are handled today. Today, there is so much more that can be done with finite element analysis in software that you don’t have to pay for. Examples of this are FreeCAD and Fusion 360, the first an open source application, the second a closed source application that is free to use for hobbyists.
This is a great episode of the Computer Chronicles if you are interested in seeing the early development of computer simulation in the early 80s. The excitement around it is pretty fascinating and we can thank these people for pushing the technology from which we enjoy the fruits today.
Some time ago I started noodling around the idea of building a replacement server for my home. I wanted to make this an extreme budget build. I came to the realization that I had become rather disconnected from the state of desktop-class video cards and really much of anything outside of the laptop world. I was hung up, for quite some time, on the case and motherboard selection. I would browse Newegg and eBay, but since I lacked a lot of information, I was in a constant state of decision-vapor-lock. What changed was when I received some hardware at no cost. An incredibly large case and an AMD motherboard locked in the portion of the project where I was unable to make any decisions and dictated the rest of the build. So, over a period of months, I slowly acquired the rest of the needed components.
The case, although in good condition, certainly looks like it was at some point outmoded and just became a place that parts were thrown into. I would guess this case is as old as my Linux journey.
The motherboard that was given to me was an AM3/AM3+ motherboard. I was actually kind of excited about this as I decided I was going to do a complete AMD build. Sure, this is an older AMD CPU socket with a silkscreen date on the board of 2013 but that meant getting something on the cheap was certain. Also, since I don’t exactly buy new things, this fit the bill.
This is what I ended up getting, mostly from eBay, so replicating this selection at this price may or may not be possible.
Power Supply – RaidMax RX-1000AP-S – $74.19
CPU – AMD FX-9590 – $119.95
CPU Cooler – Cooler Master Hyper 212 Plus – $22.59
Memory – 32 GiB DDR3 1866MHz – $64.95
Storage – 6, Seagate 2TB drives – $149.70
6-port SATA Card – $25.35
USB 3 All-in-one 5.25″ Front Panel Card reader – $19.99
Blu-ray DVD player – $50.00
2x 3.5″ to 5.25″ adapter trays – $8.58
Serial DB9 RS232 9pin com port with bracket – $4.14
6x SATA Cables – $9.48
That made a grand total of $638.87 invested in this machine. I went just a bit overbudget due to the CPU cooler. I was warned that the TDP rating on the CPU meant it was necessary to have an effective cooler.
This was the first time I have actually assembled a computer from parts and pieces. I have repaired and upgraded many, but this was the first of this level of DIY. Since every part I had was untested and I had no way to verify if anything was working, as in, nothing else upon which I could conduct individual component testing, there were a lot of uncertainties in this.
When I kicked it on for the first time and everything came up, I was incredibly relieved. There weren’t any issues at all with any of the components.
To see this machine actually start up and work in a kind of cobbled together state was not too far short of a miracle. I was very fortunate that all the used hardware actually worked.
Operating System | openSUSE Tumbleweed
There really wasn’t any other choice. I need long term reliability and I am not interested in reinstalling the operating system. I know, through personal experience, that Tumbleweed works well with server applications, is very tolerant to delayed updates and will just keep chugging away.
I have been very satisfied with the stability of Tumbleweed as a server for the last year on my “temporary system” performing that role. The issues I did have with that system, although minor, have been with video due to the Nvidia GPU. This build, I purposely avoided anything to do with Nvidia due to the dubious support they provide.
This was an area that took me several months of research and reading. My criterion was that I had to have a BTRFS RAID 10 storage array. This afforded me a lot of redundancy but also a lot of flexibility, allowing me to slowly upgrade my drives’ capacity as they begin to fail.
When deciding on the file system, I did a lot of research into my options. I talked to a lot of people. ZFS lost consideration due to the lack of mainline support in Linux. I am perfectly aware that its development is done primarily within Linux now, but it is not part of the mainline kernel and I did not want to risk the module breaking when the kernel updates. So, that was a non-starter.
I looked at a few LVM options, but I wasn’t confident in understanding all the details and I didn’t want to risk any reliability due to my ignorance. Why I ended up using BTRFS is due to the reliability and flexibility of the file system. Anyone that says RAID 10 on BTRFS is not reliable is, sadly, mistaken.
Since the motherboard I have wouldn’t recognize a software RAID and boot from it, I used a seventh drive to bootstrap the whole system. That drive also runs BTRFS for the root file system, and I threw in some swap as well.
I used the 6-port SATA card for the six drives of the BTRFS RAID array and mounted the array as /home. At some point, I want to take advantage of the subvolume capabilities of BTRFS but that will come at a later time.
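For the curious, the array creation boils down to a couple of commands. This is only a sketch under assumed device names (/dev/sdb through /dev/sdg on my imaginary layout), wrapped in a function so nothing destructive runs by accident:

```shell
make_home_array() {
    # Create one BTRFS file system across all six drives, with both data
    # (-d) and metadata (-m) in the raid10 profile.
    sudo mkfs.btrfs -L home -d raid10 -m raid10 \
        /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf /dev/sdg

    # Mounting by label brings up the whole multi-device array.
    sudo mount LABEL=home /home

    # Confirm the raid10 data/metadata profiles.
    sudo btrfs filesystem df /home
}
```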
Since this is my new central computer, as it were, I wanted this to have all the faculties for doing the regular nonsense that I conduct in my SuperCubicle. Since it seems I have made a bit of a reputation for doing computer-y things, I tend to help other people out with data recovery, backing up their systems and so forth. I also like to mess with single board computers, and although I can stick an SD card in my laptop, I wanted something with all the media card slots in it and external SATA ports for plugging in drives as well. This case already had some USB and SATA connections on top. The 5.25″ media dashboard has SD, MS, MMC, XD, TF, M2, CF and SATA interfaces. There is also a power connector port and USB 3. I have used many of these interfaces already. As a bonus, it has a temperature sensor that I attached to the CPU cooler to tell me the temperature of that monstrosity. It really hasn’t gotten very hot yet but I will see how hot I can get it after I really start pushing it.
The optical drive is also getting a regular workout as I have been dipping into the bargain bin of post-Christmas season movies to add to my media collection. All in all, this has been the perfect hardware build for me and my purposes. As it stands today, I only have 3 open bays on this machine so anything smaller, just wouldn’t do.
I didn’t just build this system to look old in my basement. I have had plans for this thing for longer than many of the parts. My number one task is that this machine is my central repository of all my data. Everything from records to movies. To that end, outside of the standard server functions you have by “flipping a couple switches” like Secure Shell, Samba, Syncthing, I wanted to go beyond this. Something “cool!”
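Flipping those switches looks roughly like this on openSUSE. The package and unit names here are the usual defaults, so treat them as assumptions, and it is kept as a function rather than something to paste blindly:

```shell
enable_base_services() {
    # Install the server basics (openSUSE package names assumed).
    sudo zypper install openssh samba syncthing

    # Secure Shell and Samba run as system services...
    sudo systemctl enable --now sshd.service smb.service

    # ...while Syncthing ships a per-user unit.
    systemctl --user enable --now syncthing.service
}
```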
Currently testing Emby, Plex and Jellyfin. This is probably what this machine does most right now. That and ripping the DVDs and Blu-rays I purchase using MakeMKV (another blathering for another time). This function doesn’t seem to be very taxing on memory or processing power. Maybe if I had more machines drawing media from it, it would, but that is not an issue at this time.
Although I am not exactly doing much gaming, I think I played a game of River City Ransom: Underground with my youngest. I have also played Descent 2 (rebirth) on this machine, and it, of course, ran it extremely fast. At this point, I haven’t come close to taxing the video card. I am planning to do more Linux gaming with it and by that, I mean, anything that I can run in the Linux environment, so Wine and Proton, those are also fair “game”.
Since this is the most capable machine I own, I’m using this to render video. It does the task in 1/3rd the time of my Dell Latitude E6440. Would faster be nicer, sure, but I don’t exactly churn out lots of video content for it to matter. I still tend to edit the video on my laptop but render it on this machine. Mostly because I don’t have great monitors for it yet. That will come later.
I will be implementing a Nextcloud server and start playing around with some note taking applications that I can self-host. Not that I am unsatisfied with Simplenote, I just happen to like to keep my options open.
Another service I want to run is Home Assistant. I have these plans for implementing “smart devices” that are not cloud based going off someplace else. I want to have Home Assistant, manage all my devices and make my home just a bit more convenient. That is also another blathering for another time.
I had originally intended to make a video of the build of this, to include the installation process, but after reviewing the video and being bored out of my mind watching it, I have kicked that to the curb and will maybe turn it into a 1980s sitcom montage to music or something.
Although this computer has only been up and running for about two months, I am slowly adding more services and functions to it. For now, it is pretty light, but in a few short months, that will most certainly start growing. I am very happy with the sub-$700 build for a computer system that has met or exceeded my expectations. It was a fun first complete, from-the-ground-up, scrap-together assembly that really was a gamble. I am pleased with how well openSUSE Tumbleweed runs on it and that I have had no disturbances with any operating system updates.
Often, after a project, you will review it, have an “After Action Review” and ask yourself, “What would I do differently if I were doing this again.” I can honestly say, there is nothing I would change. I like everything about this machine. I would, perhaps, like more storage space as I have already gobbled up 2.5 TiB of my 5.5 TiB of storage space. Reviewing what I spent and the additional cost of the larger storage, I would have still made the same decision. So, back to would I change anything? No, I think I made the right decision. I do have upgrades planned for the future but that is a project for the fall. This machine truly fits my needs, even if much of the hardware is yester-years retired bits.
Fusion 360 is a CAD / CAM application with finite element analysis capabilities. I was going through the Autodesk forums and read a lot of chatter about their position on the Linux client. It appears that for several years, there have been requests but there is no plan to support it.
One user gave a fantastic, well-thought-out, logical case for building Fusion 360 to work on Linux, answering the typical reasons given for not doing so:
the management sees not enough customers here. It’s a question about cost/income ratio.
I think if done right, there is not much cost (keyword: continuous integration)
Number of potential customers. Linux users need to raise their hands and write to Autodesk so that they can see there are potential customers. Linux already leads on the server market, and on embedded devices, smart phones and tablets (if you count Android as Linux).
On the desktop, Windows is still the dominating system (88%), Mac (9%), Linux (2%). But this is for the average user, this doesn’t need to be true for engineers and makers using CAD software.
I have no statistic here, but I personally have never seen engineers working on Mac. But I have seen many engineers, software developers and scientists that work on Linux.
Linux users are willing to Beta test and are able to generally figure things out for themselves.
There were a lot of hostile responses from Windows users that were just… hostile. I do think that is a large part of the untold story. There are those that point to Linux users and talk of technological elitism, but I don’t think that is a behavior exclusive to Linux users at all. I can refer to this post for evidence otherwise.
Even though Autodesk has stated that they have no plans to support Linux, it is always with the caveat of “at this time.” I still have hope that Linux will be supported in the future. It seems inevitable: there is a growing percentage of Linux users in the engineering field, Autodesk already supports Linux with Maya, and since there are more and more professional tools on Linux, I truly believe Fusion 360 will follow.
It took me far too long to complete the write-up and video, but I must say that the tiling features in Plasma are pretty fantastic. I spent this past weekend doing a lot of administrative work for another job of mine, and the tiling manipulation of windows and desktop navigation made the tasks far less painful than they have been historically. I have to emphasize once again that it is important to have key combinations that make sense, are easy to remember, and quickly become intuitive to you.
I made a little video about this with Kdenlive and put it on YouTube. I had a less than stellar comment about my production quality. For that, I can say, I’ll try better next time.
I did a post this last week on my use of Linux in the kitchen. I appreciated a lot of the great feedback I received from this. I don’t want to understate, at all, the value of technology in the kitchen. It is not a strange science experiment being shoe-horned into a role in which it doesn’t make sense. Linux and its array of tools make several kitchen tasks more efficient.
For my case, the right hardware was an important part of the implementation as I have a very limited amount of counter space. There were already several software applications I had been using; I just happened to further expand how I used them.
How it recently made the Christmas season more efficient…
What would improve Linux in the Kitchen is going to take some real effort on my part. Most of these things will be aided by single board computers or IoT like devices. I need more metrics in order to improve my results when baking. Improved inventory management, improved meal planning. All but the last one will take some serious work and effort in order to implement.
BDLL Follow Up
The Fedora 31 challenge. Lots of people were rough on it, and in some ways I understand but in others I do not. I have used Fedora periodically and I have always found it to be an enjoyable experience. Fedora is a lot more like getting a Lego set with some instructions than it is a ready-made product. I look at Fedora as a more industrial-grade Linux system that you implement for a specific feature, while distributions from the Ubuntu flavors are more like ready-to-use products that focus on the out-of-box experience. All the flavors of Linux have a place and a target audience. Everyone is entitled to their own opinions about a distribution experience, but I think it is almost a bit unfair to evaluate Fedora the same way you would evaluate an Ubuntu.
I have decided to use Fedora’s Plasma edition and I am going to give it a fair, but biased, review. My expectations are very focused. I don’t need the “last mile” type polish, nor do I expect that from a Fedora or an openSUSE for that matter. What I do expect is something very easy to work with and mold to my wishes.
openSUSE does a great Plasma. I don’t mean out-of-the-box perfect for my needs. No distribution should ever target me as the core user, that would be tremendously silly. I am an edge case and I am never satisfied, I am a moving target of requirements and expectations for what I want as my personal workspace. I would be a high maintenance target for a perfect out-of-box experience.
wiggle (1.1 -> 1.2) a program for applying patches that ‘patch’ cannot apply due to conflicting changes in the original. Wiggle will always apply all changes in the patch to the original. If it cannot find a way to cleanly apply a patch, it inserts it in the original in a manner similar to ‘merge’ and reports an unresolvable conflict.
bubblewrap (0.3.3 -> 0.4.0) The biggest feature in this release is support for joining existing user and PID namespaces, though this doesn’t work in setuid mode at the moment. Other changes: namespace info is stored in the status JSON, PID 1 is now marked dumpable in setuid mode, and it now builds with musl libc.
gthumb (3.8.2 -> 3.8.3)
gnome-shell (3.34.2+0 -> 3.34.2+2) polkitAgent: only set key focus to the password entry after opening the dialog; the keyboard now stops accessing a deprecated actor property
libnl3 (3.4 -> 3.5.0) xfrmi: introduce XFRM interfaces support; xfrm: fix memory corruption (dangling pointer)
mypy (0.720 -> 0.750) more precise error locations, and the daemon is no longer experimental
python-Sphinx (2.2.2 -> 2.3.1) and python-Sphinx-test (2.2.2 -> 2.3.1)
python-jedi (0.15.1 -> 0.15.2), python-mysqlclient, python-parso (0.5.1 -> 0.5.2), python-pybind11 (2.4.2 -> 2.4.3), python-typeshed (0.0.1+git.1562136779.4af283e1 -> 0.0.1+git.20191227.21a9e696)
wireshark (3.0.7 -> 3.2.0) bug fixes and updated protocol support as listed
Firefox (70.0.1 -> 71.0) Improvements to Lockwise, the integrated password manager; more information about Enhanced Tracking Protection in action; native MP3 decoding on Windows, Linux, and macOS; the configuration page (about:config) reimplemented in HTML; and new kiosk mode functionality, which allows maximum screen space for customer-facing displays. Numerous CVEs were addressed relating to memory.
I think we often take for granted the multimedia capabilities of computers today. It seems like someone is always harping about PulseAudio on Linux. I’d say they are likely not using the right distribution; by that I mean openSUSE, as I don’t have these issues. The purpose of this section is not to tout the superiority of my favorite operating system’s audio subsystem; rather, it is to talk and reflect about how great we have it today with all things audio on modern computers.
In 1983, the state of digital music was not as rich as it is today. We can enjoy a virtually endless supply of content never before available in human history. Let’s go back in time to an era when the Commodore 64 was the pinnacle of home computer audio. Audio was entirely programmed, limited to four waveforms: sawtooth, triangle, pulse and noise. There was a multi-mode filter featuring low-pass, high-pass and band-pass outputs, an attack / decay / sustain / release (ADSR) volume envelope for each of the three audio oscillators, and a few other things I barely understand. Regardless, the capabilities were limited and synthesizing voice was an incredible undertaking that took years of work long after the chip was in the wild. This was one of the first polyphonic sound chips on the consumer market and, to this day, it is held in high regard; many still like the sounds this chip produces.
All this said, this was a very interesting record of computer generated music that is certainly worth a listen. I find the experimentation and musical education tools of this period incredibly fascinating. Today, things are very different. Musical composers and artists use computers in music production, and to do otherwise would likely be considered insane. I now wonder if the individuals in the 80s that pushed the art and science of computers in music were considered insane by their peers.
Post Christmas Day shopping yielded me a really nice find, specifically something pretty fantastic from Lowe’s that allows me to fix my AC light strands. A Holiday Living Light Tester. The directions could have been a bit more clear… maybe worth a video… but I was able to recover three of my LED bush nets. Since they retail for about $10 each, that has made the purchase worth it already. This device is supposed to work with LED as well as incandescent lights. I’ve only tested it on LED thus far and it works well.
This is a device that I wish I had discovered long ago.
Christmas Lights Sequence to Music with xLights
Very comprehensive software that allows you to look at the waveforms, change playback speed and make it easier to have actions occur at the right time. I’ve only begun to scratch the surface of the power and capability of this, and the reality is, I don’t know what I don’t know about using this software. My setup is really quite simple, therefore I can’t take full advantage of its capabilities.
Some of my favorite effects to date are the butterfly, marquee, fireworks, life and fan. They currently give me the visual excitement for which I am looking to put into the sequences.
There are many more effects to discover but due to the limited nature of my display as it currently is, I can’t do some of the more fancy enhanced items, yet.
I recorded two videos and posted them to YouTube. They are nothing terribly special, but I am quite pleased with how they turned out.
Funny aside, I went to record the second sequence and there was a car parked in front of my house, waiting to watch it.
I did decide to employ an FM transmitter so that people can listen to the music in their vehicles, but I don’t actually have a sign informing them of that fact.
Plex is the old boy on the block that is well known. I haven’t used or tried it yet but this is still the one I hear the most about. Because it is the popular one, I tend to go for other things… for reasons unknown.
Jellyfin will be the next one I try. I have noticed that they have a Docker image, so I am going to take this as an opportunity to learn some things about Docker while I’m at it. The key feature of this one is that it is completely open source, and that has great appeal to me.
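Since I have not actually run it yet, this is only a sketch of how I expect the Docker route to look, based on the published jellyfin/jellyfin image; the host paths are placeholders I made up, and it is wrapped in a function so nothing runs by accident:

```shell
# Port 8096 is the web UI; /config holds server settings and /media is
# the library (mounted read-only). Host paths are placeholders.
run_jellyfin() {
    docker run -d --name jellyfin \
        -p 8096:8096 \
        -v "$HOME/jellyfin/config:/config" \
        -v "$HOME/media:/media:ro" \
        jellyfin/jellyfin
}
```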
Emby is the media server with which I started this journey and am currently testing. I planned to test the others already but I have been engaged in other matters. It has decent name recognition but did go closed source after it gained some momentum. I have been using it for a few weeks, and the feature I like is that it works much like you would expect Netflix to. If you activate notifications, you’ll be notified about a “new release” when you put something into your media repository. I thought that was kind of cute. Setting it up is pretty trivial and I will be doing a write-up on this as well.
I want to do reviews of each of these media servers on my openSUSE Tumbleweed based workstation / server and see how it goes. Really, there is enough horsepower that I can have all three running and see how each of them plays out, as it were.
Restoring my Nexus 6P To Working Order
As a kind of Christmas gift to myself, I spent the 5th day of Christmas disassembling and installing a new battery into this phone. I shelved the project in August but didn’t put it out of sight. Seeing it almost daily, I’ve had it gnawing on me to get it done and I finally did it.
I bought a battery replacement kit for this phone on eBay that had most of the tools I needed. I had no interest in doing a tear-down video as there are plenty of those on YouTube, including a video demonstrating battery replacement of the Nexus 6P. Although the repair of the device was rather annoying and tedious, you know, just difficult enough to scare off smarter people than me, the part that took the longest was updating the phone and installing LineageOS with everything working.
There was only one issue, really: working cell service. The problem ended up being that there was a security lockout preventing the SIM from being accessed, and disabling it is what fixed it.
As we wrapped up the year in BDLL challenges, our task for this week was to make some predictions about the year 2020. They didn’t have to be Linux related, exactly, but since Linux and tech is the focus of the show, it would only make sense to keep it as such.
What I am wishing for, in 2020, is commercial grade CAD / CAM, manufacturing technology software to come to Linux, not necessarily for home use but for use in business.
Specifically, what I would like to see is Fusion 360 by Autodesk supported in some level on Linux. It already runs well in Linux through Lutris but having actual support for it would be fantastic. I would also like to see PTC’s Creo running on Linux. PTC once supported Linux with earlier offerings of their mechanical design package but no longer do so today. It would be great to see.
Aside from bug fixes and the removal of unneeded dependencies, here are some of the highlights of the last six snapshots
Remmina, an RDP client, updated to version 1.3.7, which included improvements to translations, a better authentication MessagePanel API, printer sharing improvements, and various bug fixes
NetworkManager, updated to 1.8.25+20. The applet scales icons for HiDPI displays.
Bluez, the bluetooth stack, received a version update to 5.52. Fixed AVDTP session disconnect timeout handling, disabled one more segfaulting patch, and fixed numerous issues.
KDE Plasma updated to 5.17.4. Discover’s fwupd backend will no longer whine when there is unsupported hardware. Improvements to KWayland integration, and numerous other fixes and improvements.
GNOME Desktop was updated to 3.34.2 which has undoubtedly further improved the experience for its users.
GTK3 updated to 3.24.13+0
Gstreamer Plugins, updated to 1.16.2. Fixed numerous issues in the V4L2 video codecs
Wireshark updated to 3.0.7 which addressed CVE-2019-19553 CMS dissector crash
Akonadi has been updated to 19.12.0. There weren’t any features added but improvements and bug fixes were implemented.
Wireguard updated to version 0.0.20191219 that added support for nft and prefer it, and fixed other various issues.
YaST updated to 4.2.47, bug fixes and refinements to how it operates
php7 updated to 7.4.0 where systemd restrictions for FPM were relaxed and other various improvements
Tumbleweed Snapshot Reviewer gives 20191210 a stable 99; 20191211 a stable 99; 20191213 a stable 91; 20191214 a moderate 90; 20191216 a stable 96 and 20191221 a stable 98.
This is a new segment I am going to try out for a few episodes to see how it fits. Since I am a vintage tech enthusiast, not an expert, I like looking back and seeing the interesting parallels between the beginning of the home computer or micro-computer revolution and now.
The Computer Chronicles is a program that spanned 20 seasons, starting in 1983. The first episode, with original hosts Stewart Cheifet and Gary Kildall, focused on mainframes to minis to micro-computers and it was such a fascinating discussion. Stewart Cheifet asks Gary, right off the bat, whether he thinks we are at the end of the line in the evolution of computer hardware or if there are major new phases of this evolutionary process ahead.
Gary responds with a “no,” saying that computers are getting smaller, faster and less expensive. He speculated that they will get so small you will lose them like your keys.
I couldn’t help but think, if Gary were still alive today, how many times he would have lost his cell phone and whether he would think back to those words. I know that I have lost my cell phone, the one I just fixed, in my house three times.
Watching the demonstration of the TX-0, the first transistor-powered computer, was quite fascinating.
The supercomputers of the 1960s filled entire rooms while engineers experimented with parallel processing. In the 1970s, computers miniaturized to something resembling a single server rack; these were called minis and were considered portable because they were on wheels. In the late 70s and into the 80s, micro-computers came into prominence and, although substantially cheaper than the mainframes and minis, were still far more expensive than what can be picked up today.
I found this particular episode very interesting due to the excitement over how small computers were getting, even though by today's standards they were really quite large. The hunger for speed was just as apparent in 1983 as it is today in 2019… almost 2020.
The micro-computer they demonstrate here is a Hewlett-Packard HP-150, which was an attempt at being user friendly with a touch screen interface. It was nothing like the touch screens of today, as it used infrared transmitters. It is noteworthy that in the demonstration of the machine by Cyril Yansouni, the General Manager of the PC Group at HP, it was stated that the most intuitive tool to interact with the computer is your finger. That holds true today, looking at how people interact with tablets and mobile devices. The interaction seemed rather clunky by today's standards, but I think it is pretty cool to see the innovation of the time. Mr. Yansouni also stated that he doesn't think touch alone is the most ideal interface; he thinks there will be some combination of touch, keyboard, mouse and even voice that will be something more ideal. I think he was correct on this. This machine, the HP-150, has a kind of goofy look about it but is pretty cool at the same time. I'm really glad it was demonstrated.
The discussion then turned to the future of computer technology. Herb Lechner stated that the future would be networking computers together through local area networks so data can be shared. Gary Kildall and Cyril Yansouni speculated, very excitedly, that data communication over the phone system would be the future of networking, because local networks were too expensive and difficult to set up. I wonder what they would say today about this.
What I really learned from this particular episode is that, for one, our desire for smaller, faster, better computers hasn't changed. There was experimentation with the form and function of computers using the best technology had to offer at the time, and there was lots of fragmentation, far more than anything we have today. I also learned that most of the experts tend to be wrong about the future of technology; that hasn't changed today either.
2020 is on the horizon, and to quote my favorite fictional character of all time, Doc Brown, “the future is whatever you make it, so make it a good one.” Make 2020 the best year you can, be kind to one another and should things not go as you planned, don’t hold any resentment against yourself or those around you.
As a kind of Christmas gift to myself, I spent the 5th day of Christmas disassembling and installing a new battery into this “shelved” phone of mine. It is something I have wanted to do since the battery started fading and I finally got to it.
I bought a battery replacement kit on eBay for this phone that had most of the tools I needed. I am not going to provide a tear-down video; there are plenty of those on YouTube, and if you are interested in that, click here. That will tell you everything you need to know and possibly more. I am going to focus more on the areas of difficulty and the installation of LineageOS.
The Pixel was an okay phone, but it was a bit too small for my hand, I didn't like how it fit in the phone holder in my truck, its battery didn't end up being much better after about six months of use, and I couldn't put LineageOS on it because it is locked down.
The video gives you a list of tools to use for the repair. I didn't have everything exactly as suggested, so I grabbed my whole kit of available tools, and this is what I ended up using:
Plastic triangle opening tools in two different thicknesses
Tweezers. I used whatever tweezers I had in my tool box, which ultimately came from the bathroom medicine cabinet. I would recommend a better set, but something is better than nothing.
Box cutter. I didn't have a precision knife set as recommended in the video; I would highly recommend something like that and won't do another repair without it. The box cutter worked, but that is a little like using a sledgehammer when all you need is a 16 oz claw hammer. Sure, it gets the job done, but it makes a bit of a mess of your project surface.
Paper clip in place of a sim card ejection tool.
Heat gun. Mine was probably overkill but it worked fine.
Small cross-recessed (Phillips) screwdriver. The battery kit came with screwdrivers, but I prefer my nicer, more professional set. Even I can show up to a party in the right outfit from time to time…
The video recommends playing cards, but those were chewed up pretty quickly on me, so I had to use some more rigid cardboard to slide between the battery and the body of the phone. Your mileage may vary. In my case, this "Hello Fresh" junk mail bit worked better than a playing card. Basically, use anything rigid that is not so stiff as to crease the battery and cause it to vent with flame.
Spudger. Mine was not a fancy black nylon one, but it was able to pry and get between the frame and the screen well enough.
Double sided sticky tape to put the lower back panel back on
I also used a dental pick to help with the picking at the device. I recommend something like this for so many of your smaller projects, especially if you have giant sausage fingers.
1 hour of time to devote to the project
The two areas requiring extra care in this project are removing the glass around the camera and removing the battery.
The glass needed to be loosened up with the heat gun, gently, so as not to overheat the device; overheating can cause irreparable damage. Once I got this portion heated up enough, the glue let go of the glass plate enough to allow me to get the knife in there. This would have been easier with smaller, more precise tools. Thankfully, I didn't break it.
Once the glass and the plastic cover are removed, that will expose the 6 screws holding the device together.
The spudging tool is required to carefully pull the body of the phone away from the screen assembly. I was "fortunate" that this phone has a slightly buckled area around the volume button, which made it a bit easier to separate the case from the screen assembly.
The phone comes apart and exposes all the little secrets of its design. It also exposed the fact that this thing is incredibly dirty and needed a good cleaning with some isopropyl alcohol.
The other area of concern is the battery. It is imperative that you take extra care not to bend the lithium-polymer battery too much, or it will "vent with flame," which can be a rather spectacular event, one that I was not interested in having. There is a small amount of clearance that allows you to start prying the battery away. I carefully used the heat gun to loosen the glue here as well. Once I got the battery up a little, I used several card-like things to pry it up.
The rest of the instructions in the video were spot on, but the emphasis on the glass and the battery was a bit understated, from my view.
Reassembly of the device was pretty straightforward. Install the battery; I reused the adhesive pads from the previous battery. Then carefully install the cables, making sure they are seated well. The glass still had enough adhesive on it to keep it in place. The bottom plastic bit needed some double-sided tape to stay put.
Since this phone wasn’t exactly a “looker” when I started, I am not concerned about how it looks when complete. It is also in a case that will help to hold things in place.
Upgrading the Operating System
This was actually a lot more time consuming than fixing the battery, I am sorry to say. When I started the phone, I was greeted with an error about a vendor mismatch.
I have seen this error before so it wasn’t a big deal in fixing this. I downloaded the Google image and flashed the vendor image as per the instructions I found here. In short, here is the process I went through:
Download the nightly ROM and the Gapps package (mini Gapps) and put them on the phone
fastboot flash radio radio-angler-angler-03.78.img
fastboot reboot bootloader
Flash the Lineage ROM and the mini Gapps
Wipe cache, reboot
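For a bit more context, the steps above can be sketched as a shell sequence. This is a minimal sketch, not the exact commands I ran: the ROM and Gapps filenames are illustrative placeholders, and it assumes a recovery that accepts `adb sideload`. The `run` helper defaults to a dry run that only prints each command, so nothing is flashed unless you set `DRY_RUN=0`.

```shell
#!/bin/sh
# Sketch of the flash sequence described above.
# ROM/Gapps filenames are illustrative placeholders, not the exact images I used.
# By default this only PRINTS the commands (dry run); set DRY_RUN=0 to execute.
run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "+ $*"          # dry run: show the command
    else
        "$@"                 # real run: execute it
    fi
}

run adb reboot bootloader                               # enter fastboot mode
run fastboot flash radio radio-angler-angler-03.78.img  # flash the radio image
run fastboot reboot bootloader                          # reload with the new radio
# Boot into recovery, then sideload the ROM and the mini Gapps package:
run adb sideload lineage-nightly.zip                    # placeholder filename
run adb sideload gapps-mini.zip                         # placeholder filename
# Finally, wipe the cache from recovery and reboot the device.
```

The dry-run default is deliberate: fastboot commands against the wrong partition can brick a phone, so it is worth reading the printed commands before letting them execute.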
I admit these instructions are not as verbose as I normally give. If you have any issues, please leave a comment or email me and I will take the time to make it more verbose.
After this process, the phone would not recognize the SIM for cell service. I tried flashing the radio and vendor images again and still, nothing. I used a little trick from here, which also didn't help; it only told me that it didn't know the IMEI.
That trick is: on the dial pad, type in *#*#4636#*#*. It exposes some very interesting bits of information about your phone.
I reinstalled using these instructions several times. Service mode didn't provide me any solutions, and I feared that I had somehow erased the very definition of the phone's cell radio identification.
As a kind of last-ditch effort, I installed the stock Android image, and the cell phone signal miraculously worked again. Installing LineageOS once again left me with no access to the radio. After some more web crawling, the solution sort of came from this Reddit post, which said the issue has something to do with the system lock not releasing the cellular radio to the system.
Sure enough, after disabling all the security features and rebooting, cell service worked once again. The specific issue is with "Secure start-up," which requires your PIN before starting the system. There is some kind of bug in this that is causing issues; where exactly, I have no idea.
To implement this solution and disable the "Secure start-up" feature, preventing the SIM from being locked out, go to Settings > Security & Privacy > Screen lock. I prefer a PIN lock screen; when you enter your desired PIN, you are asked for your "Secure start-up" preference. Say "No," reboot the device, and cell service will work normally.
I have installed all the important applications and I am back to full mobile capacity… which… is a pretty short list, really.
Phones today are, frankly, terribly designed. The process to replace the battery is unnecessarily tedious. At this point, I consider any phone without a user-accessible battery a terrible design, and I will not purchase another phone that locks away its battery. That signals a design with planned obsolescence, and all that does is encourage greater levels of e-waste. I have great hope in the up-and-coming PinePhone, which may not have the performance capabilities of a modern "flagship" phone, but no matter how much it may lack in processing power, storage, or RAM, it does have a replaceable battery. That means it won't be a turd of a design like you get from the likes of Apple, Samsung or Huawei.
LineageOS is now a must for a good Android experience. I tried to go several months on Google-locked Android, and frankly, that is not a good experience. The lock-down of applications on the phone is terrible; I should be able to remove whatever applications I want. I have reaffirmed that I will not purchase another locked mobile device. Newer does not mean better, and stock Android is vastly inferior to LineageOS Android; it's not even a fair comparison, given the significant user improvements the Lineage team puts into Android.
Ultimately, I look forward to the PinePhone. An unlocked, user-serviceable device that may be a bit less capable on raw performance is a welcome upgrade over just about any mobile phone out there. Give me a headphone jack and access to my battery! I am done with these mobile nightmare devices.