ABQ Sunport Wifi doesn’t work with Ubuntu Linux – workaround

I have a Lenovo X1 laptop running Ubuntu 22.04, and it will not connect correctly to the free WiFi at the Albuquerque (ABQ) Sunport airport.  I’ve tried lots of things to debug and diagnose this, and nothing got it working.

However, my Android phone does connect successfully, so I was able to turn on USB tethering on the phone and reach the airport WiFi through it over a USB cable.
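
If you want to confirm the tether is actually being used, the phone shows up as a new network interface on the laptop once tethering is enabled. The checks below are generic; the interface name varies by system (mine was a usb0 / "enx…" style name), so treat the names as examples rather than a guarantee:

# Confirm NetworkManager sees the tethered phone as a new ethernet-style device
nmcli device status

# Confirm the new interface actually received an IP address
ip addr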

How to limit charging to 80% on a later model year (2014+) Nissan Leaf

Early (2011-2013) United States Nissan Leafs had a “limit charging to 80%” feature in the dash.  Nissan removed this feature from US cars after the EPA ruled that if the feature remained, they had to report a lower battery range, because on average the car would not be fully charged when it left the charger. [This ruling was incorrect in my opinion…but Nissan removed the feature so that they could advertise the EPA range based upon a 100% charge of the battery.]

I, and many other people, still like to limit charging to only 80% of the battery’s total capacity in an effort to extend the life of our EV batteries. [This extra effort may or may not be worth the trouble, and many people advise just charging to 100% and not worrying about battery health, especially for 2016+ vehicles, which have a longer battery warranty. But I like to limit charging of lithium-ion batteries to 80% if I don’t need the extra range.]

If you want to charge your Nissan Leaf to only 80% on a regular basis (unless you need extra range for longer trips), how can you do it on later model year vehicles?

There are three options that I know of:

1. The only way to get a similar effect (keeping the car below 80% SOC most of the time) using only inbuilt features (i.e. without spending extra money) is to set up the charging timer to charge your vehicle right before you leave. This only works well if you have a regular departure time each day (e.g. a fixed work schedule). When you plug in the car, it will not start charging until a few hours before your scheduled departure time, aiming to reach 100% about 30 minutes before you depart, which minimizes the time the battery sits at 100%. [If you are clever, you can lie about your departure time so that the car is reaching 80% about the time you actually leave…]

The downside is that your vehicle is not “ready to go” if you need to leave for an unplanned trip before your regularly scheduled departure time, and if you want to charge at any other time you have to remember to disable the charging timer so that the car will actually charge when you plug it in. [My 2015 Leaf has an easy-to-use button for disabling the charge timer…]
Nissan Leaf charging timer disable / off button

So this can be made to work if you have a regular schedule, but it can also be annoying.

2. You can “mimic” the “charge only to 80%” feature by using a smart / connected EVSE that has a charge limiter built in. For example, I have a JuiceBox, and when I plug in my 2015 Leaf, I use the phone app to set the “plugged in percentage” and the “stop percentage” (which I just leave at 80%); it then estimates the amount of energy needed and shuts off charging after that amount has been delivered. This requires paying money for a smart EVSE…but if you haven’t already purchased an EVSE, getting a “smart” or “connected” one with a similar feature will probably only add $100-200 to the purchase price.

JuiceBox Pro 40 front faceplate, original silver model from eMotorWerks
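
For the curious, the energy estimate behind that “stop percentage” is simple arithmetic. The numbers below are my own assumptions (a 24 kWh pack like my 2015 Leaf, a guessed 88% charging efficiency, a 45% starting charge), not whatever the JuiceBox actually uses internally:

# Rough kWh "from the wall" needed to go from 45% to 80% on a 24 kWh pack,
# assuming ~88% charging efficiency. All of these values are assumptions.
awk 'BEGIN { pack=24; start=45; stop=80; eff=0.88;
             printf "Approx. %.1f kWh\n", pack*(stop-start)/100/eff }'
# prints: Approx. 9.5 kWh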

3. There is also a 3rd-party add-on box you can install in your vehicle, the Open Vehicle Monitoring System (OVMS) hardware device, which lets you set a charge % limit, remotely pre-heat the car in the winter, and do a lot of other logging…but it costs $260 (and if you want to use it on a cellular network, away from your home/work WiFi networks or the Bluetooth range of your phone, you need to include a SIM card with data capabilities, which will probably also have a monthly fee).

Open Vehicle Monitoring System Screenshots

Because I already had a JuiceBox, I use method 2…but if I only had a non-smart EVSE, I would probably go the OVMS route, as it adds other features to the car.
[Especially since the Nissan Connect system in my 2015 Leaf no longer works, as it used an older 2G cellular service that has since been retired.]  If you have a brand new Nissan Leaf, it probably includes the Nissan Connect service, at least for the first three years of car ownership.

Sphero 2.0 battery replacement

The original batteries in my (8-10 year old) Sphero 2.0 died.

bloating lipo lithium batteries
Once I got the sphere open and removed them, it was clear that they had “bloated”.
They are marked 702035 (7mm thick, 20mm wide, and 35mm long).  However, I don’t recommend buying 702035 batteries to replace them, as the opening they need to go into is closer to 30 or 32mm in length. If I had to do it again, I’d order these 702030 batteries instead.

Reverse Bifocal Trick for Prescription Crafting Glasses


I need optical magnification to work on small crafting projects. However, I also wear prescription lenses, so I used a headband-based magnifier that I could wear over my glasses. It worked fine, but I didn’t like having to wear two different things on my head, and the forehead mount was a little uncomfortable.

So, I’ve come up with a trick that allows you to order prescription glasses that include a magnifying inset lens. For those of you who wear bifocals…yes, I’m talking about bifocals. By turning the NV (Near Vision) field of a bifocal prescription up as high as you can get it, you can get a magnifying bifocal insert of 1.87X or greater.

The formula that relates optical magnification to diopters is:

Magnification = (Diopters / 4) + 1

So with the maximum +3.50 diopter Near Vision (NV) setting allowed by Zenni Optical, I’m able to get prescription glasses that include a 1.87X magnification inset.
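
As a quick sanity check of that number (plugging the +3.50 maximum into the formula above):

echo "scale=3; 3.5 / 4 + 1" | bc
# prints 1.875 -- the ~1.87X figure quoted above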

Of course, the insets are down near the bottom of the field of vision, which works OK for reading in your lap, but not so well if you paint with your elbows on the table like I do.


To move the magnifying areas from the bottom of the glasses to the top, you need to rotate the lenses 180 degrees, AND you need to swap the right and left lenses. [So that the bifocal inserts stay on the insides, and don’t get moved to the outsides of the lenses…]

This means that when you ORDER the glasses you must REVERSE or SWAP the OS and OD (left/right eye) prescription lines!  Other than swapping left and right eye, the cylinder and axis numbers don’t need to be changed: the cylinder axis is specified modulo 180 degrees, so rotating a lens a full 180 degrees maps every axis back onto itself and is a perfect no-operation.
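
As a purely made-up example (illustrative numbers only, not a real prescription), a “normal” order of:

OD (right eye):  SPH -1.25   CYL -0.50   AXIS 90    NV-ADD +3.50
OS (left eye):   SPH -1.00   CYL -0.75   AXIS 170   NV-ADD +3.50

would be entered for this trick as:

OD (right eye):  SPH -1.00   CYL -0.75   AXIS 170   NV-ADD +3.50
OS (left eye):   SPH -1.25   CYL -0.50   AXIS 90    NV-ADD +3.50

Each full line simply moves to the other eye, with its CYL and AXIS values left untouched.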


You also need to order a lens and frame style that is perfectly symmetrical, so that you can fit the lenses back into the frames after you rotate and swap them. I recommend metal frames held together with screws, or rimless models where the lenses bolt directly to the frame pieces. (But watch the mounting holes for symmetry!) Round lenses are usually your best bet, but you could make it work with some of the hex or octagonal lens styles.

I used Rimless Glasses 3229415 from Zenni Optical. If you use my $5 “Refer a friend” link, you get $5 off, and I get $5 towards my next non-standard experimentation with optics (because this wasn’t my first order from Zenni…)

$5 off link: https://bit.ly/3LLPZCX

Alternatively, if you don’t want to hack your glasses, I recommend this headband-based magnifier with a light (Amazon affiliate link):
https://amzn.to/3xTWRIV

Total cost? This set of glasses only cost me $54 (now that I know what I’m doing), but I did waste another $50 on a different set of bifocals before realizing that the standard bifocal inset area was too low for my needs, and that I’d have to modify the prescription by swapping the left/right eye lines so that I could rotate and swap the lenses.

Here is a video about the procedure:


Lightfield capture to Looking Glass “Quilt” image, scripted on the command line

I set up this still life scene to play with lightfield capture for my Looking Glass Portrait device. Of course it has a few lenses in it, so you can see the light go through the different lenses as you move your head back and forth.

To actually capture a “lightfield” you need to take photos of the scene from multiple locations (preferably in a controlled / regular pattern). To do this I put my phone on a skateboard and rolled it across the table from left to right while recording a video. This gives me 30 pictures per second with 1280×1920 resolution.  You can see this as a vertical video on the YouTube “Shorts” platform here: https://youtube.com/shorts/TvIdTJpuWhk

Extracting the individual images from the video took a single command line:

# Take the first two seconds of the video (about 60 frames at 30 fps) as numbered JPEGs
ffmpeg -ss 00:00 -i left-to-right.mp4 -t 00:02  out%05d.jpeg

Unfortunately for me, the “input image sequence to make a lightfield hologram” feature provided by the Looking Glass Studio software doesn’t work unless you have a “real” 3D graphics card. The integrated Intel graphics built into my laptop just wouldn’t cut it, so no lightfield magic for me.

BUT, if you can generate a “quilt” image, the Studio software will import it and put it on the Looking Glass Portrait device for you, so you can get a 3D hologram from your video.  The trick is to generate the “quilt” image (which is just 48 views tiled into a single image of exactly the right size and format) from your sequence of images.

First, you need to convert each image to the proper aspect ratio (3:4, i.e. 0.75) and size (480×640 pixels).  The command line below uses ImageMagick to do this, and also flips the images upside down. (This matters for when we tile them together: we tile a lot of upside-down images, and then flip the resulting tiled image to get them into the proper orientation for Looking Glass….) I’m also putting them into a separate “flipped” directory to preserve the original images.

mogrify -flip -resize 480x640^ -gravity center -extent 480x640 -path ./flipped out*.jpeg

Once we have them properly resized, we tile them into the special Looking Glass Portrait high-res quilt, which is an 8×6 tile layout (exactly 48 images) at 3840×3840 pixels. The leftmost image from the scene (the first image in the video) should be at the bottom left; the images then advance across each row and up row by row until they end with image number 48 at the top right.

# The -tile 8x6 should be obvious
#
# The -geometry 480x640^ means to make each image 480x640.
# The ^ means resize the image based on the smallest fitting dimension.
# (redundant here, as they should already be sized correctly by the previous step)
# +0+0 means no border.

montage out000*.jpeg -gravity center  -tile 8x6 -geometry 480x640^+0+0    tempoutput.jpeg

This results in a tiled image with the leftmost image at the top left and the rightmost image at the bottom right…but since we “-flip”ed the images initially, we can now “-flip” this entire output image and give it the file name format the Looking Glass Studio software needs to recognize it as a quilt image:

convert -flip tempoutput.jpeg output-qs8x6a0.75.jpeg
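
For convenience, here is the whole pipeline as one rough script. This is a sketch of the steps above, not the exact script in the zip file below; in particular, it explicitly keeps only the first 48 frames, and it assumes the left-to-right.mp4 capture plus ffmpeg and ImageMagick are already in place:

# 1. Pull the first two seconds of frames (about 60 at 30 fps) out of the capture video.
ffmpeg -ss 00:00 -i left-to-right.mp4 -t 00:02 out%05d.jpeg

# 2. Flip and resize every frame to 480x640, writing copies into ./flipped
mkdir -p flipped
mogrify -flip -resize 480x640^ -gravity center -extent 480x640 -path ./flipped out*.jpeg

# 3. Tile the first 48 flipped frames into an 8x6 grid, then flip the whole
#    montage so the first (leftmost) view ends up at the bottom left.
cd flipped
montage $(ls out*.jpeg | head -n 48) -gravity center -tile 8x6 -geometry 480x640^+0+0 tempoutput.jpeg
convert -flip tempoutput.jpeg output-qs8x6a0.75.jpeg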


All that is left to do is to import the quilt image into the Looking Glass Studio software and sync it to your device.

(Or, if you are a beta user of the “Blocks” web-based embedded hologram service…you can upload it there and then embed the resulting hologram in webpages…)

You can download all of my source data and the scripts I used to create the quilt here:  oscar-painting-lightfield.zip

Post Ubuntu 22.04 upgrade fixes

After upgrading from Ubuntu 21.10 to 22.04 on my Lenovo X1 Gen 5, I had the following issues:

1. The built-in video player (Totem) would display H.264/AVC videos squished to the far left of the video window and only in black and white. This appears to be a problem with the GStreamer VA-API plugin, as the following command fixed it:

sudo apt remove gstreamer1.0-vaapi

2. Desktop icons disappeared. I had to manually run:

sudo apt install gnome-shell-extension-desktop-icons-ng

3. Firefox (snap) refused to display a file dialog for uploading files to websites (or download any files) until I manually installed:

sudo apt install xdg-desktop-portal-gtk

4. OpenShot video editor 2.6.1 refused to run. This is more an issue of 2.6.1 not being compatible with the latest desktop stack that Ubuntu is using, but fixing it is annoying, as OpenShot has not had an official release since 2.6.1 in Sep 2021. You have to use the “Dailies” AppImage (or build from the most up-to-date source yourself) to get a working OpenShot.


5. When trying to open a file on the desktop using the “find software” dialog, I got the error message “Failed to start GNOME software”, which was solved by:

sudo apt install gnome-software

6. Thumbnails would not show up for video files in the file browser until I installed:

sudo apt install ffmpegthumbnailer
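
If you hit the same set of issues, the package-level fixes above can be batched into two commands (same package names, Ubuntu 22.04):

sudo apt remove gstreamer1.0-vaapi
sudo apt install gnome-shell-extension-desktop-icons-ng xdg-desktop-portal-gtk gnome-software ffmpegthumbnailer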


Fixing my Neato X11 robot vacuum LCD screen

I’ve lived with a blank screen on my Neato X11 vacuum robot for a few years, but recently the robot started to beep error messages and refused to start up correctly, and I couldn’t figure out what the problem was without the screen.

So I found this thread and this specific post, and decided that injecting 12 volts into the C5 line would be worth trying. (I’m NOT going to go to the effort of replacing the entire LCD, especially if I have to remove polarized sheets and reverse them ;> )

I used a 78L12 12V 100mA linear regulator (TO-92 package) because it was inexpensive and small.

Of course, I added in a lot of hot glue for stress relief….

Thanks to AlainCAN, this fixed my LCD and I can now read the error message (fan was stuck, I found/removed a rice grain and that fixed things right up!).

Unfortunately, I somehow appear to have broken the LEDs (the backlight for the LCD as well as the button LEDs). I’m not sure if this is related to this throwaway line in Alain’s post:

By the way, don’t forget to replace the C10 capacitor as it can cause trubbles later (dimmed light green led).

Or perhaps I just forgot to plug something in….. but I can read the screen, which is better than having working LEDs without being able to read the screen, so I’m going to count it as an overall success.