Caveat: Corporate Epistemological Crisis (Why AT&T, Why?)

I took this screenshot on my phone a few weeks back, but I just now remembered I’d taken it with the intent to share it. AT&T is convinced that my phone is a “3G” phone, and they are trying very, very hard to get me to “upgrade.” The thing is… I don’t think their belief is accurate. See the screenshot from my phone, for clarification.

picture

I’ve dealt with various people in their customer service multiple times, but they are unconvinceable.

picture

Caveat: OGFMirror

[The below is cross-posted from my other blog.]

I’m super bad about posting to this blog. That’s partly because I feel a strong desire to report some actual, positive progress, which I haven’t felt able to do.

I have been very busy with HRATE technicalities. I am building – very, very slowly – a “mirror” for opengeofiction.net (OGF). I think if this is successful, then the owner of that site, who has expressed interest in “letting go” of having to continue to maintain it, will allow the mirror to take over for the site, completing the transition to a new hosting environment.

Someday, I intend to write up, in elaborate, technical detail, this process of setting up a mirror. But in broad outlines, here is what it involves (has involved, will involve).

  • Build a new Ubuntu 20.04 LTS server. This leads to lots of incompatibilities farther down the line, because the existing OGF server runs an older version. Install the basics – apache, postgresql, etc.
  • Install an OSM rails port on the server.
  • Migrate the OGF data to this server. This was very, very hard – because the OGF data (in either .osm.pbf format or in pg_dump format) proved to contain inconsistencies (data corruption). Some missing current nodes and ways had to be restored manually (text-editing the .osm files, which are just XML). This ended up being a two-week-long process.
  • Set up incoming replication from the source apidb (OGF) to the new mirror (currently being called ogfdev).
  • Set up outgoing replication for the new ogfdev instance (to drive render, overpass, etc)
  • Set up a new primary render. This had some sub-parts.
    • coastlines. This proved very difficult, because as far as I can figure out, the osmcoastline tool used to create the coastline shapefiles is broken on Ubuntu 20.04. An older version must be used. My current workaround: I’m actually running coastlines on an older server. I import a coastline-containing pbf file to the older server, run the osmcoastline tool, and post the shapefiles for consumption on the render server.
    • I made a decision to run the renders on a different server than the apidb. I think this might involve a bit more expense, short term, but it makes the whole set of processes more scalable, long term. My experience with Arhet is that the render requires scaling sooner / more frequently than the apidb, as the user base grows. Installing the render software (mod_tile and “renderd”) proved difficult. It turns out that there are some lacunae and downright incorrect steps in the documented installation sequences on github.
    • Set up incoming replication from the ogfdev database to the render database.
    • There are substantial differences in recent versions of the openstreetmap-carto style – specifically, the special shapefiles are no longer stored as datafiles in the data folder of the render directory. Instead, the shapefiles are loaded into the database. Because non-standard shapefiles are used, this means rewriting the load procedures (python scripts) – the standard approach just grabs the files for “Earth” (because who would run osm for some other planet?!), and that file-grabbing is hard-coded in the procedure. (See the sketch just after this list.)
  • Set up a new topo render. The topo render was shut down on OGF, so this will be the only working version. Unfortunately, I ran into a similar problem with some of the topo pre-processing as I ran into with osmcoastline, above. I suspect for the same reason – something in one of the dependencies they both have. So the topo pre-processing (turning the .hgt files into a contour database) is also being run on a separate, Ubuntu 18.04 server (just like the coastlines).
  • Set up appropriate changes and customizations for the front-facing rails port (osm website). This involves importing user data (done) but also user diaries (not done). These require ad hoc SQL coding that gives me flashbacks to my job as a DBA in the 2000s. Another unfinished piece – internationalization. The current ogfmirror website looks okay, but only in English. Switch to another language, and it all reverts to OSM boilerplate. Why is internationalization done so badly on production software of this kind? I see no easy solution except manually editing each language’s .yml file in turn (OSM has 100+ languages). Or building my own damn application to achieve that result.
  • Set up overpass and overpass-turbo. Overpass installs relatively painlessly, but I’m having trouble getting incoming replication to work correctly. overpass-turbo was quite difficult – the current version on github is flat-out broken, and so an older version (commit) must be compiled and installed. Further, the compilation and configuration process overwrites some of the parameters files, so the parameters files have to be modified after running the first steps of configuration, but before the last part. This is the step I am on right now.
  • Set up nominatim? – nice to have, but not urgent. Anyway, nominatim doesn’t work on the existing OGF website.
  • Implement some of the custom tools that are available on the OGF website: the “scale helper,” the “coastline helper,”…
  • What else? This is a work in progress…
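
About the shapefile-loading rewrite mentioned above: I won’t reproduce my actual scripts here, but conceptually the job is just to push the custom, non-Earth shapefiles into the render database under the table names the carto style expects. A rough sketch of the idea, using the standard shp2pgsql tool (file and database names here are made up):
# Load custom coastline shapefiles into PostGIS (names are illustrative, not my real ones):
shp2pgsql -s 3857 -I -d water_polygons.shp water_polygons | psql -d ogfrender
shp2pgsql -s 3857 -I -d simplified_water_polygons.shp simplified_water_polygons | psql -d ogfrender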

So I’ve been busy. Here is a link to the site. Bear in mind, if you are reading this in the future, the link may not show you what I’m currently writing about, but rather some future iteration of it.

https://ogfmirror.com

I’m still working on some of those last steps. Open to hearing what else needs to be done.


What I’m listening to right now.

K-os, “Hallelujah.”
picture

Caveat: Digital Anti-vaxxer

I may have just converted to a role that might be called “digital anti-vaxxer.”

My computer crashed this morning. It was far from catastrophic – I have good back-up habits. I lost a few dozen recent pictures, and some text files whose contents I can’t even remember. I may have lost some other stuff. Further, I have a perfectly good “extra” machine, which I am now using. It’s not as comfortable in its configuration, and will take some time to get used to, but it serves my basic needs fine.

And I knew that the computer in question was sickly – it was my HP “Lemon” laptop I bought in 2018, which has always had a bad battery and has long had other issues as well. I have been using it as a desktop computer (because of the bad battery). I had switched it from Windows over to Linux. It wasn’t a terrible machine.

I believe the mistake I made, yesterday, was to let alarmism seen on the internet induce me to finally look into installing some anti-virus software on my Linux machine. I did a little bit of looking around and settled on something called ClamAV.

As background: I have never run anti-virus software on my Linux machines. And frankly, I’ve never had problems with viruses or malware on my Linux machines. Even on my Windows computers, when I’ve had and used them, I have never installed any kind of paid anti-virus, though for Windows machines I’ve occasionally run “system scan” or system monitors of various kinds.

I had always felt that with respect to anti-virus software, the cure was worse than the disease. But with respect to the principle, I could see where people who had less comfort and familiarity with the inner workings of computers might have a reasonable use for anti-virus software. Or, barring that, they can get Apple products, which have the anti-virus buried inside them at such a level that it’s invisible and relatively non-disruptive to the user.

Anyway, back to my narrative: I installed the Linux anti-virus software on my computer yesterday. And, having never had a major problem with my Lemon’s software (only ever with hardware, before), this morning, I found the machine was “bricked” – this is a term used to describe computers or smartphones that simply cease, utterly, to work. Black screen, no boot, that kind of thing.

The only thing I did differently was install that antivirus software. So my conclusion: the cure was indeed much, much worse than the disease. And I am a more committed digital anti-vaxxer than ever before.

Which feels odd, since I’m not an anti-vaxxer with respect to human vaccines – which is a big deal these days. I know lots of people who are anti-vaxxers in the human realm. Those people befuddle me. So I suppose to the typical loyal consumer of anti-virus software, I must seem equally befuddling.
picture

Caveat: Postgresql blues

I have been putting a lot of energy, this last week or so, into trying to scope out a new project related to my “map server,” which I’ve mentioned here often in the past. Really what I’m trying to do is create a space for a viable “mirror” of the main geofiction site where I first started this online map-drawing hobby while convalescing from my cancer surgery in 2013. That site is suffering performance issues, and the owner of the site is too busy and uninterested to do proper maintenance, so the user community (about 200 active users) is concerned that the site will just “go down” one of these days without recourse for the users, and with a loss of all the creative work that’s been done there.

To build a mirror, I need to handle a much larger data-set than I do for my own little, previously-mentioned map server. And I have been wrestling with the database application used by the map server software, that goes under the brand name “PostgreSQL,” trying to get my development server to handle the much larger data-set. For comparison, my Arhet map server’s backup file is about 25MB. The opengeofiction map server’s backup file is 850MB. That’s a 34x increase in size.

So I tweak various running parameters for the database and the data-loading tool, called osm2pgsql, in hopes of getting it to work. So far, there is definitely a failure point at around 250MB. I spend a lot of time staring at the database monitoring screen on the server, trying to see what the point of failure is.
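
For what it’s worth, the parameters in question are the usual suspects for big osm2pgsql imports. A sketch of the kind of thing I mean (the specific numbers are illustrative, scaled for a smallish server – not a recipe that has worked for me yet):
# postgresql.conf: give the import more memory and fewer checkpoints
#   shared_buffers = 2GB
#   work_mem = 256MB
#   maintenance_work_mem = 4GB
#   checkpoint_timeout = 30min
# osm2pgsql: slim mode plus a larger node cache, so it spills to disk instead of dying
osm2pgsql --create --slim --cache 4000 --hstore \
  --style openstreetmap-carto.style -d ogfrender opengeofiction.osm.pbf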
picture

picture

Caveat: 4G or not 4G – that is a question

I spent over an hour on the phone with various representatives of AT&T Wireless. I had made a somewhat belated decision to try to set up my voicemail on my phone – now that I’ve had an AT&T plan for almost exactly 3 years, it seemed like the right time to set up voicemail, right?

Once I got to talking to the right person – the fast-talking but competent and sincere Isabella of the Philippine Islands – we got my voicemail working. I guess if you don’t set up your voicemail right away when you get a new phone plan, they assume you don’t want it, and deactivate it. So she had to activate it – which was more complicated than seems entirely necessary.

So now I have voicemail. That’s useful, maybe. We’ll see who wants to leave me messages.

But, meanwhile, there was a very strange issue. AT&T is going to be ending their support for “3G” cellphone signals. Since most people have 4G or 5G, this makes some sense – why keep supporting the old technology when most of their customers have moved on?

The problem was that two of the representatives, including Isabella, were quite aggressive in warning me that my phone would no longer be supported once the 3G service was “sunset” later this summer. “Sunset” is the term they use in corporate-speak for ending a service. And yet… there at the top of my phone, it says, “4G.” They simply insisted it couldn’t be true that my phone supported 4G, because their records showed otherwise. “I think your records are incorrect,” I said. “I’m talking to you on a 4G connection right now.”

“No, that can’t be… you’ll need to upgrade your phone.”

Well anyway, color me skeptical. I just think they’re wrong. Here is the screenshot from my phone. See there, in the uppermost right?

picture

picture

Caveat: raggedsign

Contrary to superficial appearances on this here blog, I’ve not been a layabout, in recent weeks. I’ve been quite productive in the sphere of website building and administrative work.
This spurt of productivity was impelled by a request from the owners of the gift shop, where I work part-time. They wanted me to build them an improved website for their “other business” – a cabin rental business for tourists in Klawock.
That website is now “live” and running well, hosted on one of my servers – same as this blog and all my other various web projects. You can visit that website: aplacetostayinak.com.
This work has led to a whole host of ancillary projects, as I try to clean up and update my several servers. I felt that if I was actually going to start being paid for what has so long been a hobby, I should get my proverbial ducks in a row.
By far the most difficult thing I’ve done wasn’t building that new website, but rather it was rebuilding, from the bottom up (i.e. from a bare-bones, brand-new “blank” server), my “map server,” which I’ve mentioned many times here. This has been necessary since my giant server crash a few months ago; having the old server running, with all its problems and wasted space, was very inefficient. By doing this, I could free up a lot of space for new projects without shelling out for another new server. It was quite a job, and I’m proud of the outcome, though it’s the least glamorous, since in fact the objective was to get it looking and behaving exactly like the old map server. So if you go to my map server, at its new address, you’ll see something exactly the same as my old map server (which I have now shut down). The new map server is: arhet.rent-a-planet.com.
Another difficult thing I accomplished is that I have finally built my own email server – after many years of wanting to. Nothing will change as far as reaching out to me. I haven’t “killed” any of my existing email addresses, and my gmail one remains my “primary.” But having my own email server simplifies website administration and hosting substantially – a website server produces a number of automated, administrative emails, in the vein of responses to “Lost your password?” queries or “Server backup job completed at 07:00 AKDT”.  It is actually pretty hard to get such emails to go out correctly when you don’t control your own email host. So I built one. I placed it on one of my many domains: craig-alaska.net.
As a side note, therefore, if anyone who knows me wants a customized email address, I now have the ability to provide that. The email server includes a “webmail” interface, so if you really wanted to, and trusted me enough, you could throw away your gmail account and be fulano@craig-alaska.net (or any of my other domains, or your own if you want to buy one).
I also set up a blog for a neighbor and good friend of Arthur’s, Jeff. He hasn’t done much with it, but I’m going to be providing him with some orientation so he can get his blog started: akjeff.com
Having done all that, and thinking about the fact that I am earning money from a few of these web programming adventures (though not at all breaking even, yet), I decided it was time to declare my web design and hosting “business” in some kind of official way. So I built yet another website, which is my “business” – such as it stands. Currently the income is less than the cost of the servers I have. Not to mention the programming time is, so far, “free.” I’m doing it as a hobby, I guess, but if I’m going to be making some money with it, I might as well try to look professional.
That new website: raggedsign.com. I would welcome feedback on appearance and text – it’s quite rudimentary and “first draft,” right now.
“raggedsign” is a name I came up with in around 2001 or 2002, as a kind of “brand name” for my efforts at learning website design and web programming. It went into extended dormancy during my decade in Korea and I only recently decided to resume using it for the same, original purpose. I have also used the brand-name “general semiotics” for my computer-related work, specifically my year and a bit as an independent “database design consultant” in 2006-07. I still own that domain, too, and for now I’ve redirected generalsemiotics.net to the raggedsign site.
My next project is to provide a new “Topo layer” for the opengeofiction.net site where I am still active, bearing an informal “administrator emeritus” title. The previous “Topo layer” for that site was deactivated due to performance issues, but I have always been one of its biggest users and fans. So rather than complain to the other admin people on that site about the now-missing topo layer, I thought I’d take on hosting one myself – if I can. There are some technical hurdles to be overcome. But I think I’ll manage it.
picture

Caveat: Tree #837

This tree oversaw the greening of a huckleberry bush.
picture
Meanwhile, in my garden, there was some critter that had been breaking into the greenhouse at night and eating all my newly-planted radish seeds. So I borrowed a couple of Art’s mousetraps and set them in the planters. And lo, this morning, a very fat-looking dead mouse was caught in one of the traps.
picture
Meanwhile, what with it raining today, I went down a kind of rabbit-hole on my server stuff. I built an email server. I’m not sure it will really work, or even prove useful. But it might – a lot of the problems I’ve run into with developing my own websites have come down to the lack of an email system that I fully control. So maybe it will work out.
picture[daily log: walking, 2km]

Caveat: Font Fail

[The below is cross-posted from my very sparsely-populated other blog.]
I have taken some steps to migrate one of my major geofictions – The Ardisphere – from OGF to my self-hosted OGFish clone, Arhet. The reason for this is that OGF seems increasingly rudderless and destined to eventually crash and burn, and I am emulating the proverbial rat on the sinking ship. I still hugely value the community there. But the backups have become unreliable, the topo layer (of which I was one of the main and most expert users) has been indefinitely disabled, and conceptual space for innovation remains unavailable.
One small problem that I’ve run up against in migrating The Ardisphere to Arhet is that I discovered that Korean characters were not being supported correctly by the main Arhet map render, called arhet-carto. This is a problem because the Ardisphere is a multilingual polity, and Korean (dubbed Gohangukian) is one of the major languages in use, second only to the country’s lingua-franca, Spanish (dubbed Castellanese). I spent nearly two days trying to repair this Korean font problem. I think I have been successful. I had to manually re-install the Google noto set of fonts – noto is notorious (get it?) for being the most exhaustive font collection freely available. I don’t get why the original install failed to get everything – I suspect it’s an Ubuntu (linux) package maintenance problem, rather than anything directly related to the render engine (called renderd, and discussed in other, long-ago entries on this sparsely-edited blog).
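For anyone hitting the same thing, the gist of the manual fix is just to pull in the full noto package set and rebuild the font cache – on Ubuntu, something along these lines (package names may vary a bit by release):
sudo apt install fonts-noto fonts-noto-cjk fonts-noto-unhinted   # the full noto set, including the CJK faces
sudo fc-cache -f -v                                              # rebuild the fontconfig cache
fc-list | grep -i "noto sans cjk"                                # confirm the Korean-capable fonts are now visible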
Here (below) are before-and-after screenshot details of a specific city name that showed the problem: Villa Constitución (헌법시) is the capital and largest city in The Ardisphere. Ignore the weird border-artifacts behind the name on these map fragments – the city is in limbo, right now, as I was re-creating it and it got stuck in an unfinished state.
Before – you can see the Korean writing (hangul) is “scattered”:
picture
After – now the hangul is properly-composited:
picture
You can see The Ardisphere on Arhet here – and note that within the Arhet webpage you can switch layers to OGF and see it there too. Same country, different planets!
What I’m listening to right now.

Attack Attack! “Brachyura Bombshell”.
picture

Caveat: Yet Another Map Server

I took the first steps over this past weekend and today to publish a new map server website. Here is a screenshot of its current status in my browser.
picture
This is, maybe, the fourth map server I’ve built from scratch using the OSM (openstreetmap.org) architecture, which is open source.
Partly, I do this because I keep wanting to practice and get better at putting them together. There is quite a bit of code customization required to get the server working on a “non-earth” map. So it’s good to keep in practice.
But also, if this one works adequately, I intend to migrate my Arhet map server (which has 20 or so active users, a few of whom have given me a small amount of money) to a newly built map server, fully segregated from all my various blogs and such. All this time, Arhet has existed at the timorously-named test.geofictician.net, and shared server space with multiple other applications, including this here blog thingy, a wiki, a MUD, and sundry web-doohickeys.
So first, I’m doing this practice run, using another planet, Rahet. The similarity between the names “Arhet” and “Rahet” is not coincidental: they are both anagrams of “Earth”. I intend for all my map servers to have such names.
picture

Caveat: The Terrible mysql Crash of 2021

I still don’t know how it happened. I somewhat suspect I got hacked, somehow … I found strange and unexpected Chinese IP addresses in my mysql error log. But I don’t understand the mysql back end or its administration well enough to know for sure what was going on.
I was able to restore a full-server backup to a new server instance, and have re-enabled the mysql-driven websites (my 2 blogs, my wiki, etc.) on the new instance. Meanwhile, I somewhat stupidly reactivated the non-mysql website (the geofictician OSM-style mapping site, the so-called “rails port”) on the old server instance. The consequence of that is that I am now stuck with a two-server configuration where I had a single server configuration before. I think in the long run I’ll want to isolate ALL my mysql-based sites to a single server, and ALL my non-mysql-based sites to another single server. That’s going to take a lot of shuffling things around, which is not trivial.
For now this blog (and my other blog) seems healthy and up-and-running, again.
There may be more downtime ahead as I try to reconfigure things more logically, however.
[This entry cross-posted from my other blog.]
picture

Caveat: Server down and downer…

My server crashed sometime early this morning.
I don’t know why. Specifically, some kind of fatal database error on the mysql database behind all the blogs (like this one) and several other important applications.
I have successfully restored the blog – I’ve relocated it, using a backup file, to another server.
But all the other things running on the server: my mapping application (OSM-style GIS for geofiction), my other blog, my MUD, some development work – all those other things are still missing in action.
I have a lot of work ahead of me, trying to rebuild this stuff.
picture

Caveat: Purple Screen of Death

I’ve been feeling a bit out of sorts, lately – hard to pinpoint why.
So I decided to plunge myself into computer issues. Perhaps there’s something of my uncle in me, right? I started trying to build a “development box” using my old laptop that I brought with me from Korea. It’s not (and never was) a very good computer. But I’m not looking for performance, here – just a separate machine where I can try to run things without messing up my main computer (which is the HP “Lemon” I bought in 2018 – also a laptop, but with a useless battery and some other issues, repurposed as a desktop Linux computer, and it works fine as that).
The Korean laptop is a 2009 “XNote” – whatever brand that is. It had been running “Windows 7 Korean”, which was a hassle because Microsoft doesn’t let you simply change languages in an operating system: you have to pay them first, as if you were buying a new operating system. This is true despite the fact that the data to support such a change is already inside the computer. So for all those years, I had to cope with error messages and applications running in Korean. I suppose it was a good way to learn some Korean, but it was stressful when I had to get something done, got an error message, and had to break out the dictionary to figure out what was wrong.
Anyway, I had set up linux (ubuntu 18.04) a few months ago, deleting the Korean Windows altogether. So now I ambitiously set out to replicate on this little laptop the same configuration I run on my server (the one that lives in a California “server farm” where this blog and all my mapping websites live). This is possible, as long as one isn’t concerned about speed and performance issues.
But I messed something up. I was trying to install Ruby – a programming language environment used for some of the mapping website software – and got stuck on a permissions problem. These are very common in linux, which has a pretty arcane and strict system of file permissions. In trying to repair that problem, I broke the operating system – certain files require certain permissions, or the whole apparatus comes tumbling down. I lost the ability to run root-level commands (called “sudo” in ubuntu) and furthermore, on reboot, the system hung before fully loading. End of operating system.
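(For future reference – mine or anyone else’s – the usual reason sudo dies after a permissions misadventure is that the sudo binary loses its setuid-root bit. If you can still get a root shell via recovery mode, the generic repair is small. A sketch, not something I actually pulled off before giving up:)
# From a recovery-mode root shell: restore ownership and the setuid bit on sudo
chown root:root /usr/bin/sudo
chmod 4755 /usr/bin/sudo
ls -l /usr/bin/sudo   # should read something like: -rwsr-xr-x 1 root root ...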
Microsoft’s Windows was famous for many, many years for presenting a “Blue Screen of Death” when it crashed. This was called the BSOD, and was more common than anyone liked. Well, Ubuntu Linux has its own version, except it’s dark purple rather than blue. And it’s even less informative than Microsoft’s.
So I had to start over. Tomorrow I go to work. I might not now make progress on this project until Thursday or Friday.
picture

Caveat: Round and round

[NOTE: the following is cross-posted from my other blog – just putting it here to show what my “other online persona” looks like.]
I ran across a small, free website that someone made that transforms a flat map of an imaginary planet into a globe that you can rotate with the mouse or that can be used to generate a “spinning world” gif. It’s called maptoglobe.com.
I decided I wanted to make one for my planet, Arhet – just out of curiosity. This did have a few minor technical challenges. First, I had to “knit” together the tile images for Arhet. I found a nice utility that does this, an application called tile-stitch by Eric Fischer. It can be found on github. Except for one small problem, I just followed the documentation provided in the github README. That one problem: to get it to work on my machine, I needed to modify the code in the stitch.c file to include the full path to the geotiff utilities. So…
Original code:
...
#include <geotiffio.h>
#include <xtiffio.h>
...

My version:
...
#include </usr/include/geotiff/geotiffio.h>
#include </usr/include/geotiff/xtiffio.h>
...

Once that was set up, I simply extracted the tiles at zoom level 5 from the Arhet2-carto render using the tile-stitch utility, with this command:
./stitch -o arhet5.png -- -85.05 -179.99 85.05 179.99 5 https://tiles01.rent-a-planet.com/arhet2-carto/{z}/{x}/{y}.png

That got the whole planet into a square .png file, which I called arhet5.png.
The next problem is that the maptoglobe website requires the map image to be in an equirectangular projection. But the tiles for Arhet are in the modified mercator projection used by almost all online “slippy maps,” classified as EPSG:3857.
So the arhet5.png file was in the wrong projection. I found out I could use another utility that I already had, the gdal library, to do this job. I ran the following commands.
/usr/bin/gdal_translate -of Gtiff -co "tfw=yes" -a_ullr -20037508.3427892 20036051.9193368 20037508.3427892 -20036051.9193368 -a_srs "EPSG:3857" "arhet5.png" "arhet5_tfw.tiff"
/usr/bin/gdalwarp -s_srs EPSG:3857 -t_srs EPSG:4326 -ts 6400 3200 "arhet5_tfw.tiff" "arhet5.tif"

These produced a .tif file in the right projection, 6400 x 3200 pixels. I then opened this file and resaved as .png again (because this is a more compact format that is therefore uploadable to maptoglobe.com – which has a maximum file size limit).
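(In retrospect, that open-and-resave step could probably be skipped, since gdal_translate can write PNG directly – something like the following, though I haven’t tested it in this exact workflow, and the output filename is just an example:)
/usr/bin/gdal_translate -of PNG "arhet5.tif" "arhet5_equirect.png"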
I then uploaded that .png file to the maptoglobe site, and it allowed me to save the resulting “globe” – it’s accessible here. Further, I was able to make this nice little spinning planet gif:

That’s the planet Arhet, as it currently stands – note that most of the mapping there is not my own, but the work of the various other Arhet members who have joined me in my experiment.
That worked out so well that I did the same thing for my own private planet, Rahet (note that the names Arhet and Rahet are obviously related; Rahet came first, and when I decided to change the project and invite other participants, I renamed the old Rahet as Arhet, and then resurrected the old Rahet later and as a separate project again).
Here is the link for Rahet on the maptoglobe site, and here is the spinning planet gif:

So those are pretty cool. Remember that the original “slippy maps” (HRATEs) of these two projects are on the map portion of this website, here and here.
picture

Caveat: mutely mute

I had a problem with my new phone that I might have solved. My diagnosis isn’t 100% – I could have misunderstood what I figured out. But it was puzzling, and I was unable to find any clear description or solution in online searches, so I thought I would provide my experience for future googlesearchers.
The new phone I bought is a Blackview BV5500. This is a Chinese knock-off brand – I bought it because I wanted something cheap, and I figured I could sacrifice on matters of quality for now. For the most part, Android-platformed smartphones are so commodified at this point that there isn’t much difference between the many different models and makes. Still, in terms of those sacrifices, I would say the most noticeable is battery life. While my 4-year-old Samsung Galaxy 7 still had an amazing battery life (about 36 hours at regular usage levels) and superfast recharging (full recharge from 2% battery in about 90 minutes), this new phone seems to have about 6-8 hours life at regular usage levels and recharging is quite slow. Anyway. That’s the difference between an $800 sticker price and a $200 sticker price.
The other issue I have is what you might call UI design – not at the Android level (operating system) but at the physical device level. There are only two buttons, and they are placed closely together on the right edge. I really valued the “home” button on the bottom front of my old Samsung.
Where this UI problem came to the fore, however, was in the problem I had yesterday and today. Somehow, yesterday, my phone’s basic “phone call” function became mute. That is, I could place calls, but I could neither receive nor transmit sound. I kept testing this, over and over, by calling the house phone (landline) here. The calls were connecting, the landline would ring, but there was no sound on the smartphone. The speaker, and the “speakerphone” speaker (a different speaker), and headphones, and mic, were all mute. But there was nothing in the settings to indicate that anything was muted, no icon, no control, and call volume was set to normal. It was like the speaker and mic had simply been turned off. But it was only for making “regular” calls. Skype calling worked fine. Other media applications worked fine.
The best I could find online was some hint that there was a mute function that could be invoked by pushing both the buttons on the side at once. This was not included in the documentation that came with the phone. And I kept pushing those two buttons, but it wasn’t seeming to change the behavior of the phone-calling application.
I tried so many things. I installed a separate “dialer app” – but its behavior was no different from the native app. I reinstalled a bunch of stuff. I did a full factory reset of the phone. No luck. So not only was the Blackview BV5500’s phone-calling app unable to make sound – mute – it was mute about its muteness, so to speak.
I finally got lucky – I pushed the two buttons at once while I was in the process of attempting a phone call. Suddenly, it was working fine.
My hypothesis, based on this behavior, is that the “mute” function invoked by pushing those two buttons at once is “hardware-based.” It doesn’t reside in the operating system – that’s why the factory reset didn’t help. But that “mute” function is only accessible when a call is in progress. The device is “hardware-aware” of that – which makes sense. So the only way to “push the button again” is to do so while a call is in progress.
I could be wrong about this. I was messing with a lot of settings trying to find one that would make a difference, and I wasn’t systematically testing between each little adjustment. But my hypothesis is the only one that makes sense – both in how the problem arose (it arose when fat-fingering the phone to make a call while trying to do something else at the same time), and in how it finally resolved.
I’m mostly writing this so that if someone tries to google this problem with their Blackview phone in the future, they might find a possible solution.
I will now return you to your regularly-scheduled tree / poem / banality.
picture

Caveat: imaginary map servers for rent

I have been pretty busy with computer stuff over the last few days.
That is because something new happened. For at least three years, now, I have imagined there might be a path to turning my eccentric computer-based geofiction hobby into some kind of business. Well, I officially have a first customer. I won’t say anything about that person – they may wish to preserve anonymity. But the concept is that they want their own, private “imaginary planet map server” in the style of the real world’s OpenStreetMap or Google Maps. These already exist. OpenGeofiction (“OGF”) is the most popular imaginary one, where I have been an active participant since early 2014. And in 2018 I began my own project, Arhet.
I like to call these “imaginary slippy maps” HRATE‘s: “High Resolution Alternatives to Earth.”
It has seemed to me there might be demand for these things. Geofiction isn’t exactly a popular hobby, but there are several hundred users at OpenGeofiction, and there are websites and communities dedicated to it, including the active reddit r/imaginarymaps. Further, if Hollywood is willing to pay linguists big bucks to create imaginary languages for their stories (e.g. Klingon from Star Trek, Dothraki from Game of Thrones), there might also be creators of large, mass-market fantasy or sci-fi who are also willing to pay money for professional-grade “slippy maps” of imaginary places. The current extant efforts at such things are depressingly amateurish, e.g. this map of Westeros.
picture
A few months ago, I had put out to the OGF community, in a very low-key way, that I would be willing to do the technical work and provide ongoing server hosting and administration to anyone willing to pay a minimum monthly amount on my Patreon account. Patreon is a website used by “creators” (musical performers, programmers, writers, visual artists, etc.) to provide a kind of “pay what you think it’s worth” tool for their fans and customers. On my Patreon account user page, I’d made explicit the concept, as you can see at this link (screenshot at right).
On Monday, someone reached out to me and said they were interested. So I promptly “spun up” a new geofiction server and gave them a log on username for it.
This is not trivial work, however.
I’m using the OpenStreetMap software platform – because it’s free and open source.
But it requires an Ubuntu Linux server (I rent my servers from a company called Linode, since they specialize in Linux servers). My servers live on server farms in California and New Jersey. They are not that expensive – the $20/month rate I set up on Patreon will cover the rental fee for a small server.
Building and running a Linux server from scratch is pretty involved, if it’s to be for a specialized application like a GIS map server (GIS means “Geographic Information Systems”).
I have to install databases (plural!), Apache (the webpage controller), the so-called Rails Port (the website software behind OpenStreetMap, OpenGeofiction, or Arhet), and a rendering engine (part of the OpenStreetMap architecture but not integrated into the Rails Port). Several of these pieces need customized bits of code to accommodate a function not in their original design – i.e. hosting an imaginary, non-Earth planet map. Several aspects of the OpenStreetMap platform “hard code” real-world facts and data – because the designers simply never imagined that someone would use the platform to present non-real-world data. I have to remove code references and datafiles related to Earth’s coastlines, for example, and develop alternate ways to extract that information from the planet database and generate those same datafiles in the correct format. Etc., etc.
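To give a flavor of what that list of installs looks like in practice: the base layer is ordinary Ubuntu packages, while the Rails Port, the renderer, and the stylesheet get cloned and built from source. Roughly (illustrative, not exhaustive):
# The spatial database, the web server, and the render-database loader:
sudo apt install postgresql postgis postgresql-contrib
sudo apt install apache2
sudo apt install osm2pgsql
# The Rails Port, mod_tile/renderd, and the carto stylesheet come from their
# github repositories and are built/configured by hand (and then customized).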
Anyway, I’ve got my customer’s planet up and running, including a nicely mapped chain of islands, that the customer asked me to import from their work on OGF. I’m feeling pleased with this. If I get 2 more such customers, I’ll be making enough margin (over and above server rental costs) to support my other tech requirements, such as hosting this here blog. I will not link to this new server I just built, however, since they deserve to have a say in how I publicize their work.
I doubt very much this would ever be a way to make an actual living. But it’s nice to imagine that this hobby could be turned into a supplemental source of income. So maybe it wouldn’t be a “living,” but it would be sufficient to pay for the geofiction hobby and for several other internet-based hobbies that involve money, like this blog. For now, I bought some beer. We shall see.
Meanwhile, as I often say on the OpenGeofiction site: happy mapping.
picture

Caveat: On random, square canyons

I’ve been trying to solve a strange programming problem on my map server. I have these files containing contour data (elevation data) for my fictional places. Because of the way this information is processed by the openstreetmap platform, I store this data split up into files divided along longitude and latitude lines. But that means there are boundary conditions between the files. When I use the specially-created terrain conversion tools on these files, I seem to often get strange “canyons” along the longitude and latitude lines. They look like this on the map’s contour render.
picture
So I have spent a few days trying to find the right set of parameters for the data conversion programs (developed by the founder of the opengeofiction website) to prevent these artificial canyons from appearing. It seems to be a bit of a hit or miss proposition. I’ve got it looking good now, but I’m still not sure quite what the issue is or how to systematically avoid it.
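One generic way to dodge such seams, independent of the custom OGF conversion tools, is to mosaic the neighboring tiles into a single virtual raster before contouring, so the contour algorithm never sees a file edge at all. A sketch with plain GDAL (tile names made up):
gdalbuildvrt region.vrt N10E020.hgt N10E021.hgt N11E020.hgt N11E021.hgt   # mosaic adjacent tiles
gdal_contour -a elev -i 20 region.vrt region_contours.shp                 # contour across the former seams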
picture
picture

Caveat: Deepfake Presidents Saying Bad Raps

“Deepfake” refers to the emergent art of digitally creating completely artificial video or audio, using AI (artificial intelligence) networks, to simulate real people. The quality of computer graphical animation is at such a level that it is possible to do this, now. You can make your own audio or video of people doing things they never really did, which is nearly indistinguishable from real audio or video recordings.
Someone recently made a rerecording of NWA’s “Fuck Tha Police,” a classic hip hop song from 1988. But instead of the original artists’ voices, they’ve used Deepfake simulations of 6 famous presidents’ voices.
I find this entertaining and eerie.

Six U.S. Presidents (Speech Synthesis), “Fuck Tha Police” (rap by N.W.A.).
Lyrics.

“Right about now, N.W.A. court is in full effect
Judge Dre presiding
In the case of N.W.A. vs. the Police Department
Prosecuting attorneys are: MC Ren, Ice Cube
And Eazy motherfuckin’ E”
“Order, order, order
Ice Cube, take the motherfuckin’ stand
Do you swear to tell the truth, the whole truth
And nothin’ but the truth so help your black ass?”
“You god damn right!”
“Well, won’t you tell everybody what the fuck you gotta say?”
Fuck the police comin’ straight from the underground
A young nigga got it bad ’cause I’m brown
And not the other color so police think
They have the authority to kill a minority
Fuck that shit, ’cause I ain’t the one
For a punk motherfucker with a badge and a gun
To be beatin’ on, and thrown in jail
We can go toe to toe in the middle of a cell
Fuckin’ with me ’cause I’m a teenager
With a little bit of gold and a pager
Searchin’ my car, lookin’ for the product
Thinkin’ every nigga is sellin’ narcotics
You’d rather see, me in the pen
Than me and Lorenzo rollin’ in a Benz-o
Beat a police out of shape
And when I’m finished, bring the yellow tape
To tape off the scene of the slaughter
Still gettin’ swoll off bread and water
I don’t know if they fags or what
Search a nigga down, and grabbin’ his nuts
And on the other hand, without a gun they can’t get none
But don’t let it be a black and a white one
‘Cause they’ll slam ya down to the street top
Black police showin’ out for the white cop
Ice Cube will swarm
On any motherfucker in a blue uniform
Just ’cause I’m from the CPT
Punk police are afraid of me!
Huh, a young nigga on the warpath
And when I’m finished, it’s gonna be a bloodbath
Of cops, dyin’ in L.A.
Yo Dre, I got somethin’ to say
Fuck the police
Fuck the police
Fuck the police
Fuck the Police
“Example of scene one”
“Pull your god damn ass over right now”
“Aww shit, now what the fuck you pullin’ me over for?”
“‘Cause I feel like it!
Just sit your ass on the curb and shut the fuck up”
“Man, fuck this shit”
“Aight, smart ass, I’m takin’ your black ass to jail!”
“MC Ren, will you please give your testimony
To the jury about this fucked up incident?”
Fuck the police and Ren said it with authority
Because the niggas on the street is a majority
A gang, is with whoever I’m steppin’
And the motherfuckin’ weapon is kept in
A stash box, for the so-called law
Wishin’ Ren was a nigga that they never saw
Lights start flashin’ behind me
But they’re scared of a nigga so they mace me to blind me
But that shit don’t work, I just laugh
Because it gives ’em a hint, not to step in my path
For police, I’m sayin, “Fuck you, punk!”
Readin’ my rights and shit, it’s all junk
Pullin’ out a silly club, so you stand
With a fake-ass badge and a gun in your hand
But take off the gun so you can see what’s up
And we’ll go at it, punk, and I’ma fuck you up!
Make you think I’ma kick your ass
But drop your gat, and Ren’s gonna blast
I’m sneaky as fuck when it comes to crime
But I’m a smoke ’em now and not next time
Smoke any motherfucker that sweats me
Or any asshole that threatens me
I’m a sniper with a hell of a scope
Takin’ out a cop or two, they can’t cope with me
The motherfuckin’ villain that’s mad
With potential to get bad as fuck
So I’ma turn it around
Put in my clip, yo, and this is the sound
Yeah, somethin’ like that
But it all depends on the size of the gat
Takin’ out a police would make my day
But a nigga like Ren don’t give a fuck to say
Fuck the police
Fuck the police
Fuck the police
Fuck the Police
“Yeah man, what you need?”
“Police, open now”
“Aww shit”
“We have a warrant for Eazy-E’s arrest
Get down and put your hands up where I can see ’em”
“What the fuck did I do, man, what did I do?”
“Just shut the fuck up
And get your motherfuckin’ ass on the floor”
“But I didn’t do shit”
“Man, just shut the fuck up!”
“Eazy-E, won’t you step up to the stand
And tell the jury how you feel about this bullshit?”
I’m tired of the motherfuckin’ jackin’
Sweatin’ my gang, while I’m chillin’ in the shack, and
Shinin’ the light in my face, and for what?
Maybe it’s because I kick so much butt
I kick ass, or maybe ’cause I blast
On a stupid-assed nigga when I’m playin’ with the trigger
Of an Uzi or an AK
‘Cause the police always got somethin’ stupid to say
They put out my picture with silence
‘Cause my identity by itself causes violence
The E with the criminal behavior
Yeah, I’m a gangsta, but still I got flavor
Without a gun and a badge, what do ya got?
A sucker in a uniform waitin’ to get shot
By me or another nigga
And with a gat it don’t matter if he’s smaller or bigger
(Size ain’t shit, he’s from the old school, fool)
And as you all know, E’s here to rule
Whenever I’m rollin’, keep lookin’ in the mirror
And ears on cue, yo, so I can hear a
Dumb motherfucker with a gun
And if I’m rollin’ off the 8, he’ll be the one
That I take out, and then get away
While I’m drivin’ off laughin’, this is what I’ll say
Fuck the police
Fuck the police
Fuck the police
Fuck the Police
“The verdict
The jury has found you guilty of being a redneck
White bread, chicken shit motherfucker”
“But wait, that’s a lie!
That’s a god damn lie!”
“Get him out of here!”
“I want justice!”
“Get him the fuck out my face!”
“I want justice!”
“Out, right now!”
“Fuck you, you black motherfuckers!”
Fuck the police
Fuck the police
Fuck the police

picture

Caveat: too many planets

I have been busy with trying new stuff and experimenting with my GIS server.
I now have 8 distinct views of 4 distinct “planets” running on the server. Only one of those planets is real – that’s Earth, of course. Included there for comparison purposes.
I have been learning a lot about some new aspects of GIS systems admin under the OSM architecture. That’s good I guess. Good to learn new things.
picture
picture

Caveat: logs and lettuces and loopy isolines

I worked on my firewood collection for a while in the morning.
picture
I saw some lettuces growing nicely in my greenhouse.
picture
I created a really messed-up topo map on my server. Something went wrong with the algorithm. I later learned it had to do with not deleting some temporary files left over from a previous run of the same program.
picture
Another day in my moss-covered, misanthrope’s paradise.
picture

Caveat: Rendering Rocks and Trees

I spent a rainy afternoon making some pleasing and surprising progress on my “map server” architecture, which is one of my chief hobbies.
One thing I want to be able to do is to eventually create and host my own “contour” (elevation) data for my geofictional places. Currently, this contour work is hosted at the OGF website, e.g. my island called Tárrases (link to the contour map). I want to be able to host this type of map on my own server.
It’s quite intricate to use raw GIS (geographic information systems) data to “draw” one’s own digital contour maps.
As a first step, I have imported the raw data for a small corner of my home in Southeast Alaska to my server. This is digitized planetary height data, freely available from the NASA website. After nearly 3 months of on-and-off effort, I have finally managed to render (draw) the contour map, using the OpenTopoMap architecture (a Mapnik render architecture). Note that this really is only contour data – I didn’t import the other real-world map data, and in fact only placed a few of my own invented towns, such as the towns called “Rock” and “Tree” on the map. The town of “Rock” is out at the northern tip of Noyes Island, where Arthur likes to go fishing. It’s a rocky little cape. The town of “Tree” is my home, of course.
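For the curious: the NASA data comes as one-degree .hgt tiles (the SRTM format), and getting from one of those to contour lines in a database is conceptually quite simple with generic GDAL and PostGIS tools, even though the OpenTopoMap toolchain dresses it up considerably. A sketch (tile and database names are examples, not my actual setup):
gdal_contour -a elev -i 10 N55W134.hgt contours_N55W134.shp        # 10-meter contour lines from one SRTM tile
shp2pgsql -s 4326 -I contours_N55W134.shp contours | psql -d topo  # load them into PostGIS for rendering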
Here is a link to the current server, but I’ll include a screenshot below, since the link might end up evolving or changing as I continue refining this effort. You can click that screenshot to enlarge it.
picture
picture

Caveat: https://

Probably most of you won’t even notice, but as of about 3 pm this afternoon, Alaska Daylight Time, this here blog moved from “http” to “https”. This is a substantial accomplishment that has taken me two years to get around to. I had things running well, and so was afraid to mess with it – as they say, “If it ain’t broke, don’t fix it.”
But I finally bit the bullet, in the wake of my frustrating experience with my other server over the last few days. In essence, the work on that server functioned as a “practice run” for what I needed to do.
Why make this change? Well, “http” is being phased out in favor of “https”, all over the online world. The latter is “more secure” by some standard I actually don’t really understand, but it’s become the norm for well-run and “safe” websites, so as long as caveatdumptruck.com remained on “http”, it was in danger of ultimately being ostracized from the respectable part of the internet. When I left my prior blog host in 2018, it was because they were forcing me to move to “https”, but at that time I wasn’t ready. So now I finally got ready, and did it. It’s not that I disagreed with their wanting me to move; I disagreed with their having made the decision for me, without consultation.
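I won’t document the whole procedure, but with Apache and the free Let’s Encrypt certbot tool (which I already run on my servers), the heart of the change is small – roughly this, with the domain shown as the example:
sudo apt install certbot python3-certbot-apache
sudo certbot --apache -d caveatdumptruck.com    # fetches the certificate and updates the Apache config
sudo certbot renew --dry-run                    # confirm automatic renewal works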
If done correctly, the migration should have zero impact on the user experience. The one casualty was the free little “flag counter” I’d managed to find – I’ll have to find another that’s “https” compatible.
Happy web surfing.
picture

Caveat: 8 hours of sysadmin annoyance

I had a bit of an annoying two days. Yesterday, I was pursuing my hobby of building websites, by trying to build out a prototype of a website for a friend. An early step in this process is creating a subdomain on one of my domains (e.g. caveatdumptruck.com is this blog’s domain, and something like “blog.caveatdumptruck.com” would be a subdomain). This should be an easy step, but it’s crucial because you don’t want to mix a new website up with any existing websites on a given domain.
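(For context, “creating a subdomain” means two small things: a DNS record pointing the new name at the server, and a new Apache virtual host to answer for it. A minimal sketch of the latter, with a made-up subdomain:)
# Create and enable a bare-bones virtual host for the new subdomain (names invented):
sudo tee /etc/apache2/sites-available/newsite.example.com.conf > /dev/null <<'EOF'
<VirtualHost *:80>
    ServerName newsite.example.com
    DocumentRoot /var/www/newsite
</VirtualHost>
EOF
sudo a2ensite newsite.example.com.conf
sudo systemctl reload apache2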
Instead, somehow the versions of some of the software running my web server got “out of sync” – that’s my best guess as to what happened. There’s a lot of software on the “back end” of a website: apache (the “web server”), php (the thing that makes webpages “interactive”), wordpress (the blog publishing tool that can also be used to build not-so-bloggy websites of various sorts), certbot (the free certificate tool that makes websites “secure” so your browser doesn’t give you alarming notices about bad guys), etc. All of these have to talk to each other. And if they have different versions, they might stop understanding each other.
So something was bad. And my whole server ended up down and all websites on it inaccessible. I spent 8 hours today uninstalling and reinstalling various bits and pieces, trying to get everything in sync again. It was really above my competency. So frustrating, because when I finally got it working, I’m not even sure how I did it.
So a 15 minute task took two days, with 8 hours this morning desperately trying to get my server up and running – this is not the main server, so my blog and map server weren’t down, but my pictures are hosted on this secondary server, as well as several sites I’m running for friends.
picture

Caveat: rootless

Sometimes I work on my server. I have been trying to automate the map-rendering job for the geofiction site I built, which until now I’ve had to run manually. The problem I ran into comes down to permissions. Who knew that even the infamous Linux ‘root’ user is sometimes not the right person for the job? Emphasis added to the excerpt of the log file, below.

...
INFO: Total execution time: 16621 milliseconds.
Stopping renderd (via systemctl): renderd.service.
osm2pgsql version 0.95.0-dev (64 bit id space)
Using lua based tag processing pipeline with script ~/src/openstreetmap-carto/openstreetmap-carto.lua
Using projection SRS 3857 (Spherical Mercator)
Osm2pgsql failed due to ERROR: Connection to database failed: FATAL:  role "root" does not exist
...

It took me all day to figure this out. Not that I was working on it, exactly. Art and I went to town, did our shopping, came home, ate dinner. All the while, I was cogitating on this problem, and how it matched up with the results I was(n’t) seeing. And then, sitting there, it clicked.
These are the more pleasing moments of computer work – when a seemingly intractable problem presents itself and you work it through in your mind and you solve it. After it clicked, I came and opened the log file and saw the error, above, and it was an easy fix to the bash script.
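(The nature of the fix, for anyone googling the same error: osm2pgsql connects to postgres as whichever Linux user runs it, and my automated job was running as root, a role that doesn’t exist in the database. The cure is to run that step as an account that does exist as a database role – along these lines, with the account name and flags shown only as illustration:)
# Run the import step as the account that owns the render database, not as root:
sudo -u renderacct osm2pgsql --append --slim -d gis /var/tmp/changes.osc.gz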
picture

Caveat: debate-o-matic

One of the subjects that I taught to my students in Korea that I considered most valuable, both for the English skills it engendered as well as for general thinking ability, was debate.
I was the “debate teacher,” and I was well-known for even turning lessons otherwise structured into impromptu debates. The kids mostly seemed to get something out of it.
So now… they’re trying to make an AI (artificial intelligence) that can do debate – in the same way that we have machines now that play chess or baduk (“go”), that diagnose medical conditions or explore other planets. This is just another small step.
I watched this video.

I am both disappointed and impressed. This is often the case when confronting these odd black boxes that computer engineers are constructing these days. They can seem preternaturally smart and eerily stupid at the same time. The AI participating in this debate clearly had a lot of facts to hand, and was reasonably competent at marshaling them in a well-structured argument. But it missed the key thrust of its human opponent’s argument, and thus its rebuttal almost failed to make sense. I was somewhat annoyed that the moderators, who spent time afterward discussing what they’d just done, failed to bring this up.
picture

Caveat: Sitting in an 1880’s Ohunkagan brownstone, dreaming of an imaginary world named Arhet

[This is a cross-post from my other blog.]

I have utterly neglected this blog [meaning that other blog].

I offer no excuses. Just didn’t cross my mind. I had other things going on. I have other blogs and other, non-geofictional projects that occupy me.

In fact, I have been quite busy with geofiction, too. Over the last 6 months since my last blog post here, I have been developing my “Ohunkagan 1880” snapshot, at OpenGeofiction. This is my city in my fictional state of Makaska, in the parallel-universe US called the FSA. Here is a screenshot of the city, in its 1880 incarnation. I intend to roll the historical window forward, mapping in changes and additions over the coming decades, until it catches up to the present.

picture

[Technical note: screenshot taken at this URL (for future screenshots to match).]

But I have also been working on my own, long-neglected map server. I have named my planet: Arhet.

It’s just a name. But one thing that always annoyed me about OGF was that the planet not only lacks a name, but there has always been strong community resistance to finding a consensus name for it. Someone is always bound to object to any proposal, and thus, “OGF world” remains unnamed. For my planet, I decided to just put a name on it from the start, so no one would end up grappling with the dilemma later.

Arhet is tentatively open to interested mappers. I’ve written up my current thinking on how this will work, here:

http://wiki.geofictician.net/wiki/index.php/Arhet

Music to make worlds by: The Youngsters, “Smile (Sasha Remix)”.

CaveatDumpTruck Logo

Caveat: Tree #259

I had a frustrating day, trying to repair my map server. I’m not sure if I’ve repaired it, now, but I got into one of those obsessive mindsets that made me recall that in fact, Arthur and I behave quite similarly around computers. Although I think I don’t cuss quite as much as he does. It seems to kind of work. Something amiss with the database.
In darkness, in rain, trees still lurk.
picture
picture[daily log: walking, 1km]

Caveat: Tree #257

It has been one of those rainy days that just demotivates a person. I have been spending some time installing some programming tools on my desktop and server, while I wait for my enrollment process to move forward for the University of Alaska Southeast Teacher Certification program. I suppose I’m more and more feeling that in the long run, I may end up doing computer work, and it would be smart to keep my skills up. Frustrated with the Eclipse IDE, I decided to try out VSCode, which is Microsoft’s entry to the Open Source IDE market. It’s a kind of weird reversal, running Microsoft software on a Linux machine. But so far it seems to work better than the buggy Eclipse.
A tree I saw the other day. Not very well focused.
picture
picture[daily log: walking, 1km]

Caveat: There, in the calm of some Platonic dream

This poem, below, was not written by a human being, as best I understand. It was written by one of those new “learning algorithm” AIs (Artificial Intelligences), where you give the AI a large pile of “training data” (i.e. in this case, a vast corpus of human-written poetry) and then say, more or less, “OK, give me a new one like that.” It works similarly to the way google-translate manages to make sense out of changing one language to another, without actually understanding a damn thing. It’s statistics, writ large.

Methinks I see her in her blissful dreams:
Or, fancy-like, in some mirage she lies,
Majestic yet majestic, and of seems
The image of the unconquerable skies.
Methinks I see her in her blissful dreams:
—Or, fancy-like, in some majestic cell,
Where lordly seraphs strew their balmy dreams
On the still night, or in their golden shell.
There, in the calm of some Platonic dream,
Sits she, and views the unclouded moon arise
Like a fair lady full of realms divine;
And, all at once, a stony face and bright
Glittering in moonlight, like the noon-tints of a night.

I found it, and other AI-generated poetry, on the slatestarcodex blog.
All very interesting.
 

Caveat: Git topo

[This is a cross-post from my other blog.]

I finally got tired of dealing with Windows 10 drama, and decided to rebuild my preferred Ubuntu Linux desktop, as I’d been using in Korea before moving away last July.

I’ve made good progress on that, and have JOSM up and working again, and all that. But I became aware, as I was migrating my data and files, that I have a lot of files I would rather not lose, especially related to my geofiction. I need some systematic means of keeping stuff backed up.

I handled the issue of backup and redundancy for my creative writing years ago, when I started storing all my drafts and notes in google docs. It’s convenient, too, because I can get to my writing no matter where I am.

But I have no such system for all my .osm files for the geofiction. Especially important are the .osm files I use for drawing the topo layer, since those are never uploaded anywhere except temporarily at the time of an update.

I suppose I could just copy the files. But I decided I needed to store them in some kind of version-controlled space. About two years ago, I’d had them in a git repository but it was just copied out to an extra harddrive. I used git for some other stuff I used to do, so it wasn’t that hard to figure out.

I decided this time to try something different – I made a repository on github and decided to put my topo .osm files there. If I get in the habit of regularly updating the git repository, I’ll always have those topo files, no matter what happens to my computer or where I am. Further, if ever I go in the direction of wanting to collaborate on drawing topo files, this will make it really easy (assuming the other person is up to dealing with checking things out of a git repository). [UPDATE: this was a short-lived effort. Subsequently the files are just files, again, but they live on one of my HRATE servers]
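
(The setup itself is nothing exotic – the bog-standard github workflow, with the repository name invented here for illustration:)
cd ~/geofiction/topo
git init
git add *.osm
git commit -m "initial commit of topo .osm files"
git remote add origin git@github.com:someuser/geofiction-topo.git   # repository name is made up
git push -u origin master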

If ever there will be a truly collaborative geofiction “planet” with a master topo layer, this might be a way to maintain that information, since practically speaking it can’t and shouldn’t be uploaded to the map server. Just an experiment, I guess, and meanwhile I’ll have a reliable backup of my work.

Music to map by: 선미 (Sunmi), “가시나” (Gashina).

CaveatDumpTruck Logo
