cultural reviewer and dabbler in stylistic premonitions
It looks huge on a Mercator Projection map even though it isn’t that large.
In the Mercator projection it appears to have about the same area as Africa, while in reality it is about a fourteenth of Africa’s size. But I wouldn’t say it “isn’t that large”: if Greenland were independent it would be the 12th largest country in the world (as Denmark currently is, thanks to it).
I see. What a mess.
The instructions at https://docs.searxng.org/admin/installation-docker.html mention that the docker image (which that page tells you to just pull and run) has its “sources hosted at” https://github.com/searxng/searxng-docker, and has instructions for running the image without docker-compose.
But the Dockerfile source for the image is actually in the main repo at https://github.com/searxng/searxng/blob/master/Dockerfile, and the searxng-docker repo actually contains a docker-compose.yaml and different instructions for running it under compose instead.
Anyway, in the docker-compose deployment, SEARXNG_BASE_URL (yet another name for this… neither SEARXNG_URL nor BASE_URL, but apparently it sets base_url from it) is constructed from SEARXNG_HOSTNAME on line 58 here: https://github.com/searxng/searxng-docker/blob/a899b72a507074d8618d32d82f5355e23ecbe477/docker-compose.yaml#L58
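For reference, the relevant bit of that compose file looks something like this (quoting from memory of the linked commit, so treat it as a sketch rather than gospel):

```yaml
# excerpt from searxng-docker's docker-compose.yaml (searxng service);
# SEARXNG_HOSTNAME comes from the .env file, falling back to localhost
environment:
  - SEARXNG_BASE_URL=https://${SEARXNG_HOSTNAME:-localhost}/
```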
If I had a github account associated with this pseudonym I might open an issue or PR about this, but I don’t and it isn’t easy to make one anymore 😢
Changing SEARXNG_HOSTNAME in my .env file solved it.
nice. (but, i assume you actually mean SEARXNG_URL? either that or you’re deploying it under some environment other than the one described in the official repo, because the string HOSTNAME does not appear anywhere in the searxng repo.)
https://docs.searxng.org/admin/settings/settings_server.html says you need to set base_url, and that by default it’s set to $SEARXNG_URL.
however, https://docs.searxng.org/admin/installation-docker.html#searxng-searxng says that if you are running it under docker, the environment variable which controls base_url in the config is actually BASE_URL rather than SEARXNG_URL.
(possibly the relevant variable is currently empty, which might make it construct a URL based on the IP address it is configured to listen on.)
in my experience DeepL has the best results for some language pairs while Google is better for others (and has a lot more languages).
But, these days I’m happy to say Firefox translate is the first thing I try and it is often sufficient. I mostly only try the others now when the Firefox result doesn’t make sense or the language is unsupported.
Yeah, that would make sense - language detection is trivial and can be done with a small statistical model; nothing as complicated as a neural network is needed. i think just looking at bigram frequency is accurate enough once you have more than a few words.
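(as a sketch of what i mean - a toy bigram-frequency detector; the two-sentence “profiles” here are made-up stand-ins for the large per-language corpora a real detector would be trained on:)

```python
from collections import Counter

def bigrams(text: str) -> Counter:
    # keep letters and spaces, lowercase, then count overlapping 2-grams
    text = "".join(c.lower() for c in text if c.isalpha() or c == " ")
    return Counter(text[i:i + 2] for i in range(len(text) - 1))

def score(text: str, profile: Counter) -> float:
    # sum the profile's frequency of each bigram seen, normalized by length
    grams = bigrams(text)
    total = sum(grams.values()) or 1
    return sum(profile[g] * n for g, n in grams.items()) / total

def detect(text: str, profiles: dict) -> str:
    return max(profiles, key=lambda lang: score(text, profiles[lang]))

# toy profiles; real ones would come from megabytes of text per language
profiles = {
    "en": bigrams("the quick brown fox jumps over the lazy dog and then some"),
    "de": bigrams("der schnelle braune fuchs springt über den faulen hund"),
}
print(detect("the dog jumps over the fox", profiles))  # -> en
```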
If that is what is happening, and it is only leaking the language pair to the server the first time that pair is needed, that would be nice… I wish they made it clear if that is what is happening 😢
Probably that’s when it does online connection?
since the help says it is downloading “partial language files” automatically, and the button never changes from “Download” to “Remove” if you don’t click Download, logically it must sometimes need to download more of a language which you have previously downloaded a “partial language file” of.
i am curious if the choice of which parts of the “language file” (aka model) it is downloading really does not reveal anything about the text you’re translating; i suspect it most likely does reveal something about the input text to the server… but i’m not motivated enough to research it further at the moment.
Wow, thanks for the about:translations tip - I was wondering how to do that!
Besides “Translate page” there is also a “Translate selection” option in the right-click menu so you can translate part of a page.
However, unless you download languages in the “Translation” section of Firefox preferences, it doesn’t actually always work while offline:
As you pointed out, the help page explicitly says there is “no privacy risk of sending text to third parties for analysis because translation happens on your device, not externally”. But after I translate something in a new language I haven’t used before, it still doesn’t appear as downloaded in the preferences (ie, it still shows a “Download” button rather than a “Remove” button).
The FAQ has a question “Why do I need to install languages?” with this answer:
Installing languages enables Firefox to perform translations locally within your browser, prioritizing your privacy and security. As you translate, Firefox downloads partial language files as you need them. To pre-install complete languages yourself, access the language settings in Firefox Settings, General panel, in the Language and Appearance section under Translations.
I wonder what the difference between the “partial” language files and the full download is, and if that is really not leaking any information about the text being translated. In doing a few experiments just now, I certainly can’t translate to new languages while offline, but after I’ve translated one paragraph in a language I do seem to be able to translate subsequent paragraphs while offline. 🤔
Anyway, it probably is a good idea to click “Download” on all the languages you want to be able to translate.
The server isn’t exposed to the internet. It’s a local IMAP server.
if it is processing emails that originate from the internet, it is exposed to the internet
security updates are for cowards, amirite? 😂
seriously though, Debian 7 stopped receiving security updates a couple of years prior to the last time you rebooted, and there have been a lot of exploitable vulnerabilities fixed between then and now. do your family a favor and replace that mailserver!
From the 2006 modification times, i wonder: did you actually start off with a 3.1 (sarge) install, upgrade it to 7 (wheezy), and then stop upgrading at some point? if so, personally i would be tempted to try continuing to upgrade it all the way to bookworm, just to marvel at debian stable’s stability… but only after moving its services to a fresh system :)
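(one release at a time, roughly like this - a sketch, assuming the machine still boots and keeping in mind that releases older than bullseye have moved to archive.debian.org:)

```sh
# one hop of the wheezy -> jessie -> stretch -> buster -> bullseye -> bookworm
# ladder; repeat with the next codename each time. archived releases have
# expired Release files, hence the Check-Valid-Until override.
cat > /etc/apt/sources.list <<'EOF'
deb http://archive.debian.org/debian jessie main
EOF
apt-get -o Acquire::Check-Valid-Until=false update
apt-get dist-upgrade
```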
I asked a question the other day about whether I could somehow input my handwritten notes into programs like Trilium (or logseq, whatever) and memos. OCR/HCR seems too far behind still, so I am unsure.
I just left this comment on your post.
Via the pine64 blog update about their e-ink tablet, TIL about inkput (using OnlineHTR), which appears to be a step in the right direction.
Similar to this: https://github.com/alibahmanyar/breaklist
Relatedly, there was a company selling a cloud(🤡)-based product called “Little Printer” from 2012 to 2014; after their backend predictably shut down, some fans of it recreated it as https://tinyprinter.club/ and later https://nordprojects.co/projects/littleprinters/
somehow input my handwritten notes
I’ve heard the reMarkable e-ink tablet’s cloud service has good-enough-to-be-usable handwriting recognition, but sadly I haven’t heard of anything free/libre and/or offline that is good enough.
Brendan Howell’s The Screenless Office is “a system for working with media and networks without using a pixel-based display. It is an artistic operating system.”
You can “read and navigate news, web sites and social media entirely with the use of various printers for output and a barcode scanner for input”.
weird, i wonder why. i just checked on an ubuntu 24.04 system to confirm it is there (and it is).
i guess your computer’s power button might not be supported (out of the box, at least) by Linux’s acpi implementation :(
I disagree: the headline is clickbaity and implies that there is some ongoing conflict. The fact that the Fedora flatpak package maintainer pushed an update marking it EOL, with “The Fedora Flatpak build of obs-studio may have limited functionality compared to other sources. Please do not report bugs to the OBS Studio project about this build.” in the end-of-life metadata field the day before this article was written, is not mentioned until the second-to-last sentence of it. (And the OBS maintainer has since said “For the moment, the EOL notice is sufficient enough to distance ourselves from the package that a full rebrand is not necessary at this time, as we would rather you focus efforts on the long-term goal and understand what that is.”) The article also doesn’t answer lots of questions.
Note again that OBS’s official flathub flatpak is also marked EOL currently, due to depending on an EOL runtime. Also, from the discussion here it is clear that simply removing the package (as the OBS dev actually requested) instead of marking it EOL (as they did) would leave current users continuing to use it and unwittingly missing all future updates. (I think that may be the outcome of marking it EOL too? it seems like flatpak maybe needs a way to signal to users that they should uninstall an EOL package at update time, and/or inform them of a different package which replaces one they have installed.)
TLDR: this is all a mess, but, contrary to what the article might lead people to believe, the OBS devs and Fedora devs appear to be working together in good faith to do the best thing for their users. The legal threat (which was just in an issue comment, not sent formally by lawyers) was only made because Fedora was initially non-responsive, but they became responsive prior to this article being written.