I write code and play games and stuff. My old username from reddit and HN was already taken, and I couldn’t think of anything else I wanted to be called, so I just picked some random characters like this:

>>> import random
>>> ''.join([random.choice("abcdefghijklmnopqrstuvwxyz0123456789") for x in range(5)])
'e0qdk'

My avatar is a quick doodle made in KolourPaint. I might replace it later. Maybe.

I know a little Japanese, but I’m not very good at it.

Alt: e0qdk@reddthat.com

  • 38 Posts
  • 127 Comments
Joined 1 year ago
Cake day: September 22nd, 2023






    • Sora wa Takaku Kaze wa Utau from Fate/Zero (2nd cour) – it’s a powerful song, and I think I listened to this one all the way through in every episode. Definitely one of my all-time favorites.
    • Taiyou - Denpa-Teki na Kanojo. (This OVA is pretty obscure, I think.) Another powerful song. There are no visuals for most of the ending (just text credits scrolling) – although seeing 神戸守 (Mamoru Kanbe) listed as the director (監督) jumped out at me! No Klimt this time, but it’s funny that I’m already talking about something else he worked on. Maybe I should go track down his other works more systematically…
    • Kesenai Tsumi - FMA 2003 – I have a lot of nostalgia for this song and listened to it way too much as a teenager after my friends started introducing me to anime. The version on animethemes is a bit different from what I remember visually but the song is the same.
    • Wareta Ringo - Shin Sekai Yori – I was actually thinking about posting an animepic clipped from this the other day since it popped back into my mind…
    • Hibari - Lord El-Melloi II Sei no Jikenbo: Rail Zeppelin Grace Note (the Fate/Zero spin-off series) – I like both the song and the visuals (with the seasons changing)
    • My Pace – Bleach ED6; I didn’t much care for the filler seasons of Bleach, but the synth from this ED and the dancing characters got stuck in my head for a while.
    • X Jigen e Youkoso - Space Dandy. This one is memorable to me both for the “Hey, Everett…”/「ねぇ、エヴェレット」bit specifically and the general subject of the song.
    • Zzz - Nichijou – both the art and the song are great. There are a couple of versions, but I like this first one the best.

    Edit: corrected the link to the Space Dandy ED


  • Have you tried Resonance? It’s a mystery adventure game set in modern times where you play as four different characters whose stories interconnect. It’s been a while since I played it (a decade or so?) but I remember that it had an interesting game mechanic that let you use memories like items in various interactions, as well as a number of puzzles that I rather liked the design of.


  • Sorry if addressed in the link (I’m not willing to visit Twitter) – but, like, actually McDonald’s themed? Or are they just sponsoring a show (like P&G, etc. have done for ages)?

    If the former, I guess there’s some precedent with the KFC visual novel and Isekai Izakaya and such, but that still sounds pretty weird…

    Edit: I went back and checked, and it looks like McDonald’s was also a sponsor on the show I remember P&G from (i.e. season 1 of Bleach), so there’s precedent for them sponsoring Studio Pierrot’s shows too – I just don’t usually pay that much attention to it, I guess.



  • It’s not a GUI library, but Jupyter was pretty much made for the kind of mathematical/scientific exploratory programming you’re interested in doing. It’s not the right tool for making finished products, but is intended for creating lab notebooks that contain executable code snippets, formatted text, and visual output together. Given your background experience and the libraries you like, it seems like it’d be right up your alley.
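
    For example, a single notebook cell can mix the computation and an inline plot (this is just a sketch; it assumes numpy and matplotlib, which may or may not be the libraries you’re using):

    # one notebook cell: compute something and the figure renders inline below it
    # (assumes numpy and matplotlib are installed; start with `jupyter notebook` or `jupyter lab`)
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 2 * np.pi, 200)
    plt.plot(x, np.sin(x) * np.exp(-x / 5))   # damped sine wave, just as a placeholder
    plt.title("quick exploratory plot")
    plt.show()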


  • I mentioned in a past comment a while back that I made a catalog of my anime. One thing I noticed while making it is that everything except for one movie already had an entry on the English-language Wikipedia. That movie is Gundress from 1999. According to my personal journal I apparently watched it once back in 2014, but I remembered nothing about it, so I loaded it up recently and rewatched it.

    The movie has that “sort of hard to follow if you don’t already know the source material” kind of feel – although I think this is the original work? I checked the Japanese Wikipedia entry about it after watching it. Running the article through a translator, it describes a seriously botched initial theatrical showing and a mismanaged production, with the film only being finished after it had already premiered in theaters. The version I have is finished, of course; if half the movie weren’t colored in, I’d definitely have remembered that!

    The DVD menu prominently credits it as “Masamune Shirow’s Gundress”, but I’m not sure what his role in the production actually was. He’s listed in the opening credits for 設定協力 which got translated to English as “Characters Designed by” – but different people are credited with character and mech design in the end credits. A literal translation is something like “setting cooperation”.

    There are definitely a number of familiar elements, with some buildings reminiscent of Dominion Tank Police, mech suits that reminded me of designs in GitS:SAC, as well as thermoptic camouflage, cable-based cyborg communication (jacked into the neck), cyberdiving, etc. coming up during the story.

    Unusually, this anime features a Little Arabia enclave within the Japanese “Bayside City” the story is set in and one of the main characters is Muslim. I think this may be the only time I’ve seen Arabic script in anime – although I don’t know what it says.

    I clipped some screenshots and stacked them up so you can see what it looks like, if you’re curious: https://files.catbox.moe/qtsa0d.png (~8MB)








  • I was just thinking about the image resizing thing again when I saw your message notice pop up. Another option for preview is a web browser. A minimal HTML page with some JS to refresh the image would avoid the image-resize-on-reload problem and gives you some other interesting capabilities. Python also ships with a kind of meh (slow and quirky) but probably sufficient HTTP server (python3 -m http.server), for example if you’d prefer to load the preview on a different computer on your LAN entirely (e.g. cellphone / tablet / … ).

    A simple HTML file for this would be something like:

    <!DOCTYPE html>
    <html>
      <head>
        <style>
          html, body {
            background-color: #000000;
          }
        </style>
        <script>
          function reload()
          {
              let img = document.getElementById("preview");
              let url = new URL(img.src);
              url.searchParams.set("t", Date.now());    // cache breaker; force reload
              img.src = url.href;
          }
    
          function start()
          {
              setInterval(reload, 500);
          }
        </script>
      </head>
      <body onload="start()">
        <img id="preview" src="output.png">
      </body>
    </html>
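
    To serve it, something like this from the directory containing the page and output.png works (preview.html is just a placeholder name for whatever you save the file as):

    python3 -m http.server 8000

    Then open http://localhost:8000/preview.html in the browser – or use the machine’s LAN address instead of localhost from another device.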
    
    

    Regarding input from a gamepad – I’ve had some similar ideas before but haven’t really had much success using a gamepad artistically outside some limited things where I either wrote the entire program or was able to feed data into programs that accepted input over the network (e.g. via HTTP and which I wrote a custom adapter for). It’s been a long time since I’ve tried anything in that space though, and it might be possible to do something interesting by trying to make the system see the combination of a gamepad stick as relative mouse motion and trigger as pen pressure. I’m not quite sure how to go about doing that, but I’ll let you know if I find a way to do it.
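
    If you do want to poke at that idea on Linux, a very rough and untested sketch with the python-evdev package might look like the following – the device path, axis codes, and scaling values are all assumptions that depend on the specific gamepad, and it only covers turning a stick into relative mouse motion (mapping a trigger to pen pressure would need a virtual tablet-style device, which is more involved):

    #!/usr/bin/env python3
    # Rough sketch only: gamepad stick -> relative mouse motion via uinput.
    # Assumes the python-evdev package, permission to read the gamepad device
    # node and to create uinput devices, and an Xbox-style pad that reports
    # stick positions in roughly -32768..32767.
    from evdev import InputDevice, UInput, ecodes as e

    pad = InputDevice("/dev/input/event0")   # hypothetical device node; yours will differ

    # virtual device the system sees as a relative pointing device
    ui = UInput({e.EV_REL: [e.REL_X, e.REL_Y], e.EV_KEY: [e.BTN_LEFT]})

    DEADZONE = 4000   # ignore small stick noise near center
    SCALE = 8000      # bigger value -> slower cursor

    # Note: this only moves the cursor when the stick position *changes*;
    # holding the stick at a fixed deflection would need a polling loop instead.
    for event in pad.read_loop():
        if event.type == e.EV_ABS and event.code in (e.ABS_X, e.ABS_Y):
            value = event.value if abs(event.value) > DEADZONE else 0
            rel = e.REL_X if event.code == e.ABS_X else e.REL_Y
            ui.write(e.EV_REL, rel, value // SCALE)
            ui.syn()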


  • The Wikipedia article for hqx points out that an implementation exists as a filter in ffmpeg.

    You can run a command-line conversion (e.g. PNG -> PNG) with hqx upscaling like: ffmpeg -i input.png -filter_complex hqx=4 output.png

    The =4 is for 4x upscaling. The implementation in my version of ffmpeg supports 2x, 3x, and 4x upscaling.

    As a quick and dirty way to get a semi-live preview, you can do the conversion with make and use watch make to try to rebuild it periodically. (You can use the -n flag to shorten the interval if the default of two seconds is too long to wait.) make will exit quickly if the file hasn’t changed. Save the image in your editor and keep an image viewer open that supports auto-reload on change to see a “live” preview of the output. (e.g. eog can do it, although it won’t preserve the size of the image – at least not in the copy I have, anyway; mine’s a bit old though.)
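
    Concretely, that loop is just something like (the one-second interval is arbitrary):

    watch -n 1 make

    in one terminal, with eog output.png left open alongside it.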

    Sample Makefile:

    output.png : input.png Makefile
    	ffmpeg -y -i input.png -filter_complex hqx=4 output.png
    
    

    Note the -y option, which tells ffmpeg to overwrite the output file; otherwise it will stop and ask whether you want to overwrite it every time you save. Also, in case you’re not familiar with Makefiles: the line with the command to run needs to start with a real tab (not spaces).

    ffmpeg also appears to support xbr (with =n option as well) and super2xsai if you want to experiment with those too.
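
    For example, the same pattern as the hqx command above should presumably work:

    ffmpeg -i input.png -filter_complex xbr=4 output.png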

    I’m not sure if this will actually do what you want artistically, but the existing implementations in ffmpeg make it easy to experiment with.



  • Can Z3 account for lost bits? Did it come up with just one solution?

    It gave me just one solution the way I asked for it. With additional constraints added to exclude the original solution, it also gives me a second solution – but that second solution is peculiar to my implementation and does not match yours. If you modeled exactly how the bits are supposed to end up in the result, you could probably correctly find any other solutions that exist, but I just did it in a quick and dirty way. (There’s a sketch of the exclusion step after the code below.)

    This is (with a little clean up) what my code looked like:

    solver code
    #!/usr/bin/env python3
    
    import z3
    
    rand1 = 0.38203435111790895
    rand2 = 0.5012949781958014
    rand3 = 0.5278898433316499
    rand4 = 0.5114834443666041
    
    def xoshiro128ss(a,b,c,d):
        t = 0xFFFFFFFF & (b << 9)
        r = 0xFFFFFFFF & (b * 5)
        r = 0xFFFFFFFF & ((r << 7 | r >> 25) * 9)
        c = 0xFFFFFFFF & (c ^ a)
        d = 0xFFFFFFFF & (d ^ b)
        b = 0xFFFFFFFF & (b ^ c)
        a = 0xFFFFFFFF & (a ^ d)
        c = 0xFFFFFFFF & (c ^ t)
        d = 0xFFFFFFFF & (d << 11 | d >> 21)
        return r, (a, b, c, d)
    
    a,b,c,d = z3.BitVecs("a b c d", 64)
    nodiv_rand1, state = xoshiro128ss(a,b,c,d)
    nodiv_rand2, state = xoshiro128ss(*state)
    nodiv_rand3, state = xoshiro128ss(*state)
    nodiv_rand4, state = xoshiro128ss(*state)
    
    z3.solve(a >= 0, b >= 0, c >= 0, d >= 0,
      nodiv_rand1 == int(rand1*4294967296),
      nodiv_rand2 == int(rand2*4294967296),
      nodiv_rand3 == int(rand3*4294967296),
      nodiv_rand4 == int(rand4*4294967296)
      )
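
    Just as a sketch of the exclusion trick mentioned above (continuing from the variables defined in the script – not exactly what I ran): switch from the z3.solve convenience call to an explicit solver and reject each model that has already been found:

    # sketch: enumerate solutions by excluding each model found so far
    # (reuses a, b, c, d, nodiv_rand1..4, and rand1..4 from the script above)
    s = z3.Solver()
    s.add(a >= 0, b >= 0, c >= 0, d >= 0,
          nodiv_rand1 == int(rand1*4294967296),
          nodiv_rand2 == int(rand2*4294967296),
          nodiv_rand3 == int(rand3*4294967296),
          nodiv_rand4 == int(rand4*4294967296))

    while s.check() == z3.sat:
        m = s.model()
        print(m)
        # forbid this exact (a, b, c, d) assignment and look for another
        s.add(z3.Not(z3.And(a == m[a], b == m[b], c == m[c], d == m[d])))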
    
    

    “I never heard about Z3”

    If you’re not familiar with SMT solvers, they are a useful tool to have in your toolbox. Here are some links that may be of interest:

    Edit: Trying to fix formatting differences between kbin and lemmy
    Edit 2: Spoiler tags and code blocks don’t seem to play well together. I’ve got it mostly working on Lemmy (where I’m guessing most people will see the comment), but I don’t think I can fix it on kbin.



  • If I understand the problem correctly, this is the solution:

    solution

    a = 2299200278
    b = 2929959606
    c = 2585800174
    d = 3584110397

    I solved it with Z3. Took less than a second of computer time, and about an hour of my time – mostly spent trying to remember how the heck to use Z3 and then a little time debugging my initial program.
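
    If you want to sanity-check those numbers, a quick sketch would be to feed them through the xoshiro128ss function from my other comment using plain Python ints and compare the outputs against the observed values (they should agree to roughly the precision the constraints used, around 9 digits):

    # sanity check: run the recovered state through the generator model
    state = (2299200278, 2929959606, 2585800174, 3584110397)
    for _ in range(4):
        r, state = xoshiro128ss(*state)
        print(r / 4294967296)   # compare against rand1..rand4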