Someone asked Amazon for a report on everything that their Alexa listening device had listened to since they first got it. It was enormous.
Think about which words Amazon might have set it up to alert various US government departments about — and which departments those might be. (That might depend on which country you are in.)
Amazon never hears anything in my home.
Aside from the danger of remote surveillance and control, digital systems to control parts of your home have a tendency to be too smart for your own good.
When I looked for an apartment in 2019, I rejected outright any building with digital rather than metal keys. I do not trust whoever controls the system not to track residents' movements. I also rejected outright any building with a security camera.
The Food and Drug Administration tried to reject the new mRNA flu vaccine without even examining it, in accord with antivax ideology, but has changed its mind and will now examine it.
Many plug-in hybrid cars use considerably more fuel than their manufacturers claim. How much more varies from model to model.
A bill has been proposed in California to require all 3D printers to come equipped with gun-pattern detectors, starting in 2029.
In principle, I don't defend the right to make your own guns. I don't think what this law requires is inherently unjust.
However, there will be a tendency to implement it by making it impossible or illegal to modify the software in a 3D printer. That would be unjust.
The US government's museum about the US Constitution has fired its president, and people suspect he was fired because he wasn't a magat.
Criticizing society's assumptions about how to collect and then use data, and about what makes data valid or useful.
Volunteer medics from the UK and US who denounced atrocities committed by Israelis, which they witnessed while working in Gaza, say that Israel has refused to let them return to Gaza to work as medics again.
This retribution seems designed to intimidate present and future medics in Gaza so that they will help Israel cover up atrocities.
US citizen George Retes was arrested by deportation thugs while he was commuting to work, then jailed for days without access to family, an attorney, or information about the charges against him. He is now free, and suing. I hope he wins, but this is not enough.
The fact that Retes is an Army veteran does not seem significant to me. It would be just as bad if they did this to someone who was never in the US military.
I believe in being very tough on government crime.
When individual official thugs break the law in a significant way, they deserve to pay a penalty, such as a prison sentence. When a government agency breaks the law, the agency should also pay a penalty. But when an agency follows a general practice of breaking the law, and that practice is accepted by upper management, the upper management should get the prison sentence.
The US told Europe it still wants an alliance — but only with tyrannical, right-wing countries.
The EU should avoid agreeing to those conditions, and avoid being scared by the threats, while accepting the alliance as if it didn't have those conditions. That way it can play for time, hoping the US kicks out these right-wing jerks, and use the time to prepare for the US to break off the alliance.
Monarch: Legacy of Monsters has a fantastic title sequence. Since the show takes place in two timelines, they split the titles between the past on the left and the present on the right, contrasting similar events in each timeline. And season 2 keeps up this conceit, but the sequence was completely rebuilt!
So here are all four quadrants from seasons 1 and 2, stacked.
Also, this show is still killing it.
In my online undergraduate P5.js course, students are about to begin the module on motion and physics, including a bit of physics simulation using Matter.js. It suddenly occurred to me that I had never seen anybody put together this particular demo before, and I realized it had to be done. Messy source code here.
Previously, previously, previously, previously, previously, previously.
E.g. "unicrud --block Katakana".
The actual XFT font I get from XftFontOpenXlfd("-*-sans serif-bold-r-*-*-*-180-*-*-*-*-*-*") is
Noto Sans-300 :familylang=en :style=Bold :stylelang=en :fullname=Noto Sans Bold :fullnamelang=en :slant=0 :weight=200 :width=100 :pixelsize=401.899 :foundry=GOOG :antialias=True :hintstyle=1 :hinting=True :verticallayout=False :autohint=False :globaladvance=True :file=/usr/share/fonts/truetype/noto/NotoSans-Bold.ttf :index=0 :outline=True :scalable=True :dpi=96.4557 :rgba=5 :scale=1 :minspace=False :fontversion=131334 :capability=otlayout\:DFLT otlayout\:cyrl otlayout\:grek otlayout\:latn :fontformat=TrueType :embolden=False :embeddedbitmap=True :decorative=False :lcdfilter=1 :namelang=en :prgname=unicrud :postscriptname=NotoSans-Bold :color=False :symbol=False :variable=False :fonthashint=True :order=0 :namedinstance=False :fontwrapper=SFNT
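For the curious: XftFontOpenXlfd hands that XLFD to fontconfig, and you can replay the fontconfig half of the match yourself with fc-match. A rough Python sketch, with the caveat that the pattern string "sans serif:bold" is my loose approximation of the XLFD's family and weight fields, not a literal translation of it:

    # Replay the fontconfig match that Xft performs internally.
    # "sans serif:bold" approximates the XLFD above; adjust to taste.
    import subprocess

    out = subprocess.run(["fc-match", "-v", "sans serif:bold"],
                         capture_output=True, text=True, check=True).stdout
    # Show the properties that appear in the dump above.
    for line in out.splitlines():
        if any(key in line for key in ("family:", "style:", "file:")):
            print(line.strip())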
The articles, produced in collaboration with the investigative collective WAV, detailed a years-long, multi-ministry charm offensive by Palantir to sell its software to Swiss federal authorities. The campaign was, by all accounts, a comprehensive failure. Swiss agencies rejected Palantir at least nine times, with concerns ranging from data sovereignty to reputational risk to the simple fact that nobody needed the product. [...]
So how does a sophisticated data intelligence company respond to well-sourced investigative journalism based on official government documents?
By suing the journalists, of course.
Previously, previously, previously, previously, previously, previously, previously.
In 1995 someone could have written a paper which went like this (using modern vernacular) and advanced the field of AI by decades:
The central problem with building neural networks is training them when they're deeper than two layers, due to gradient explosion and gradient decay. You can get around this problem by building a neural network which has N values at each layer, which are multiplied by an NxN matrix of weights and then have ReLU applied to them. This causes the derivative of effects on the last layer to be proportionate to the effects on the first layer, no matter how deep the neural network is. This represents a quirky family of functions whose theoretical limitations are mysterious but which demonstrably works well for simple problems in practice. As computers get faster it will be necessary to use sub-quadratic structures for the layers.
History being the quirky thing that it is, what actually happened is that decades later the seminal paper on those sub-quadratic structures happened to stumble across making everything sublinear, and as a result people are confused about which is actually the core insight. But the structure holds: in a deep neural network, you stick to ReLU, softmax, sigmoid, sin, and other sublinear functions, and magically you can train neural networks no matter how deep they are.
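To make the gradient-decay claim concrete, here's a small numpy sketch (mine, not the hypothetical paper's): push a gradient backward through a deep stack of random NxN layers and compare its size under sigmoid versus ReLU derivatives. The He-style initialization scale and the random stand-in pre-activations are my assumptions.

    # Backpropagate a unit gradient through DEPTH random NxN layers and watch
    # its norm: sigmoid derivatives (at most 0.25) crush it toward zero, while
    # ReLU derivatives (0 or 1) keep it on roughly the same order of magnitude.
    import numpy as np

    rng = np.random.default_rng(0)
    N, DEPTH = 64, 100

    def backprop_norm(act_deriv):
        grad = np.ones(N)
        for _ in range(DEPTH):
            W = rng.normal(0, np.sqrt(2.0 / N), (N, N))  # He-style init (my choice)
            pre = rng.normal(0, 1, N)         # stand-in for forward-pass values
            grad = W.T @ (grad * act_deriv(pre))
        return np.linalg.norm(grad)

    sigmoid = lambda x: 1 / (1 + np.exp(-x))
    print("sigmoid:", backprop_norm(lambda x: sigmoid(x) * (1 - sigmoid(x))))
    print("ReLU:   ", backprop_norm(lambda x: (x > 0).astype(float)))

On these assumptions, the sigmoid number underflows toward zero while the ReLU one stays within a few orders of magnitude of 1, which is the whole trick.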
There are two big advantages which digital brains have over ours: first, they can be copied perfectly for free, and second, as long as they haven't diverged too much, the results of training them can be copied from one to another. Instead of a million individuals with 20 years of experience each, you get a million copies of one individual with 20 million years of experience. The amount of training data we humans need to become useful is minuscule compared to what current AI needs, but AI has the advantage of sheer scale.
I have this pulley wheel, 50mm inside diameter, 4mm groove. I need a rubber traction ring to go inside it. I cannot find anyone who will sell this to me.
The ring must be flat or concave, not round like a typical gasket-seal O-ring, or the string it's pulling will just slide off the track.
Alternatively: any similar-sized metal pulley wheel that comes with a friction surface pre-attached, with an 8mm axle hole and a set screw.
I have tried coating it with Sugru, but that is too soft and wears off after not-very-long.
Update: If you're going to say "why don't you just" or "have you searched for" without a purchase link to a product of the correct size, please know that you are not helping.
Yes, naturally, you would need a prosthetic butthole for this movie about a celibate religious sect, Amanda. I completely agree.
"I was pregnant and naked, but I wasn't naked at all, and at the end of the movie, I'm standing in front of a burning building with just a merkin," she explained. "I felt so free."
Unfortunately, she did clarify: "You cannot see my butthole in the scene, but I swear there is a prosthetic butthole there." Release the butthole cut.
Jason Scott of the Internet Archive was kind enough to digitize them for us. And these nearly-40-year-old VHS tapes turned out to be of surprisingly high quality! The very high-resolution scans of the raw tapes are at the Internet Archive.
I've also split them apart and uploaded them to YouTube, so here's a playlist of more than 24 hours of live performances at DNA Lounge spanning the years 1988 through 1992! Plus some other stuff.
We are also hosting a memorial for Spencer on the afternoon of Sat, Apr 4. If you knew him, please stop by!

Everything written by AI boosters tracks much more clearly if you simply replace "AI" with "cocaine".
I shall demonstrate!
(Not linking to OP, because it's trash.)
"Let's pretend you're the only person at your company using cocaine.
You decide you're going to impress your employer, and work for 8 hours a day at 10x productivity. You knock it out of the park and make everyone else look terrible by comparison. [...]
In this scenario, you capture 100% of the value from your adopting cocaine."
Previously, previously, previously, previously, previously, previously.
I have two YouTube accounts, jwz and dnalounge, and I'm using the oauth API with both of them to automate uploads and stuff. With the DNA account, I am getting a refresh_token that lasts forever. But with the jwz one, I am getting a refresh_token that can only refresh the access_token for a week, and then I have to log in again. Any ideas what fuckery is afoot?
The DNA token does this:
GET https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=DNA_ACCESS_TOKEN_1 =>
    access_type => "offline",
    audience => "DNA_PROJECT_ID.apps.googleusercontent.com",
    expires_in => 3574,
    issued_to => "DNA_PROJECT_ID.apps.googleusercontent.com",
    scope => "https://www.googleapis.com/auth/youtube"

POST https://accounts.google.com/o/oauth2/token
    client_id => "DNA_PROJECT_ID.apps.googleusercontent.com",
    client_secret => "DNA_CLIENT_SECRET",
    grant_type => "refresh_token",
    refresh_token => "DNA_REFRESH_TOKEN"
result:
    access_token => "DNA_ACCESS_TOKEN_2",
    expires_in => 3599,
    scope => "https://www.googleapis.com/auth/youtube",
    token_type => "Bearer"
token expiration 0:00:59:34 => 0:00:59:59
but the jwz token does this:
GET https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=JWZ_ACCESS_TOKEN_1 =>
    access_type => "offline",
    audience => "JWZ_PROJECT_ID.apps.googleusercontent.com",
    expires_in => 3413,
    issued_to => "JWZ_PROJECT_ID.apps.googleusercontent.com",
    scope => "https://www.googleapis.com/auth/youtube"

POST https://accounts.google.com/o/oauth2/token
    client_id => "JWZ_CLIENT_ID",
    client_secret => "JWZ_CLIENT_SECRET",
    grant_type => "refresh_token",
    refresh_token => "JWZ_REFRESH_TOKEN"
result:
    access_token => "JWZ_ACCESS_TOKEN_2",
    expires_in => 3599,
    refresh_token_expires_in => 201701,
    scope => "https://www.googleapis.com/auth/youtube",
    token_type => "Bearer"
token expiration 0:00:56:53 => 0:00:59:59
refresh expires in 2:00:01:41
Maybe I'm logging in wrong? I log in with user/pass/TOTP as "jwz@jwz.org", which takes me to the channel "@yesthatjwz", then I load:
https://accounts.google.com/o/oauth2/auth?client_id=JWZ_PROJECT_ID.apps.googleusercontent.com&redirect_uri=https://localhost&response_type=code&scope=https://www.googleapis.com/auth/youtube&access_type=offline
and it asks me to choose my "brand" account. There are three listed: "DNA Lounge", "yesthatjwz", and another "jwz" account. The selection that works is the "yesthatjwz" one. The mystery account is @alsojwz1853; I don't know why it exists, but I'm afraid to delete it in case that breaks something.
When I sign in with "jwz@jwz.org", it takes me directly to my real channel, @yesthatjwz.
When I sign in with: "yesthatjwz" or "youtube@jwz.org" or "yesthatjwz@jwz.org", it asks me to select a channel: @yesthatjwz or "also jwz" @alsojwz1853.
Trying to sign in with "alsojwz1853" says "could not find your account".
Another clue: both the "DNA Lounge" and "yesthatjwz" accounts work with or without at-signs, /dnalounge, /@dnalounge, /yesthatjwz and /@yesthatjwz, but the other one only works as /@alsojwz1853, not as /alsojwz1853. Maybe because they are old accounts that pre-date YouTube being purchased by Google? Another difference is that the thing in console.cloud.google.com/auth/clients/*_PROJECT_ID for DNA is an "iOS client" created in 2014, but for "jwz" is a "Desktop client" created in 2024. There don't seem to be any settings.
But I still don't understand why the DNA and jwz accounts have different behavior.
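In case anyone wants to reproduce the comparison, here's a minimal Python sketch of the refresh call above. The endpoint, grant type, and field names are the ones in the transcripts; the placeholder credentials are stand-ins. A never-expiring refresh token comes back without refresh_token_expires_in, while the week-limited one includes it.

    # Refresh an access token and report whether Google attached an expiry
    # to the refresh token itself (the difference between the two accounts).
    import requests

    def refresh(client_id, client_secret, refresh_token):
        r = requests.post("https://accounts.google.com/o/oauth2/token", data={
            "client_id": client_id,
            "client_secret": client_secret,
            "grant_type": "refresh_token",
            "refresh_token": refresh_token,
        })
        r.raise_for_status()
        tok = r.json()
        ttl = tok.get("refresh_token_expires_in")
        print("refresh token lives forever" if ttl is None
              else f"refresh token dies in {ttl} seconds")
        return tok

    # refresh("JWZ_CLIENT_ID", "JWZ_CLIENT_SECRET", "JWZ_REFRESH_TOKEN")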