Posts by robinsonb5





    Initializing the hard disk was slower than on older ST cores, but it did find all 6 partitions (C, D, E, F, G, H) from my hdf.

    Accessing directories on the hdf file now times out frequently.

    Could you try the attached core, and let me know if it solves this issue, please?

    (I found a bug which I'd introduced into the bookmark code, which basically disabled the bookmarks. There's still a weird memory corruption / ROM loading issue which only appears to affect V2 hardware - still investigating that...)

    Actually the Minimig core doesn't have a flicker fixer - what it has is a scandoubler. A scandoubler stores each line of video and outputs it at twice the speed, turning a 15.6KHz video mode into a 31.2KHz video mode. That's enough to make the PAL / NTSC video displayable by some PC monitors. However it doesn't remove the flicker in interlaced PAL / NTSC modes.
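    As a rough illustration (a Python toy model of the idea, not how it's implemented in the FPGA - a real scandoubler uses a line buffer in block RAM and a doubled pixel clock):

```python
# Toy model of a scandoubler: each incoming scanline is buffered and
# replayed twice at double speed, so a 15.6KHz mode becomes a 31.2KHz
# mode.  Interlace flicker is untouched - fields are not merged.
def scandouble(frame):
    """frame is a list of scanlines; each line is output twice."""
    out = []
    for line in frame:
        out.append(line)  # the line as it arrives
        out.append(line)  # the buffered copy, replayed
    return out
```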


    Removing interlace flicker requires buffering two entire frames of video, then merging them and outputting them as a single frame at twice the speed. While the SDRAM in the Chameleon is just about fast enough to pull off that feat, it's not fast enough to do much else at the same time, so it's not something that can realistically be integrated into a core.
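    The merging step is essentially a "weave" deinterlace. A Python toy model of it (what this hides is exactly the two-field buffering and memory bandwidth cost described above):

```python
# Toy model of a flicker fixer: buffer both fields of an interlaced
# frame, then interleave their lines into one progressive frame, output
# at twice the field rate.  In hardware this needs two full field
# buffers, which is what strains the Chameleon's SDRAM bandwidth.
def weave(even_field, odd_field):
    """Interleave two buffered fields into a single progressive frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame
```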


    Also, as Paul correctly points out, the 1084 can only handle 15KHz video modes, so even if the Chameleon had a flicker fixer, you wouldn't be able to use it with the 1084.


    The Minimig core does support ECS/AGA Multiscan / DblPAL / DblNTSC modes, and also RTG modes, all of which can provide a flicker-free high res display - but again these aren't usable with the 1084.

    Hi,

    Yes, I have been using the same hdf and a Platinum 32 GB SD card with a cluster size of 16k.

    Because it is an older card, I freshly formatted a new Sandisk Extreme Pro 32 GB with a cluster size of 16k.

    Unfortunately the ST core behaves just as it did with the Platinum card.

    My hdf file for the ST is 3 GB, split into 6 partitions - could that 3 GB size cause problems?

    Thanks, that's really helpful - I'll do some testing myself with smaller clusters and see what happens. (In short, you can't easily "rewind" a file on a FAT filesystem - you can follow cluster chains forwards but not backwards - so if you read the last byte of a file and then a byte from the previous cluster, you have to start from the beginning of the file and follow the entire cluster chain through the file - which is slow. I currently use a "bookmarks" system to speed this up - a more traditional full index won't fit in the BRAM - but I can increase the number of bookmarks and tweak the algorithm.)
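    To illustrate (a toy Python sketch, not the actual firmware code - the FAT table, function name and bookmark layout are all invented for the example), here's why forward-only cluster chains make rewinding slow, and how sparse bookmarks shortcut the walk:

```python
# The FAT maps each cluster to the next one in the chain; there are no
# back-links, so reaching an earlier cluster means walking from the
# start of the file - unless a bookmark lets us resume partway along.
FAT = {10: 11, 11: 12, 12: 13, 13: 14, 14: None}  # toy cluster chain

def cluster_at(start, index, bookmarks=None):
    """Return the cluster holding the index-th cluster of the file."""
    pos, cluster = 0, start
    if bookmarks:
        # Resume from the nearest bookmark at or before the target.
        for b_index, b_cluster in sorted(bookmarks.items(), reverse=True):
            if b_index <= index:
                pos, cluster = b_index, b_cluster
                break
    while pos < index:
        cluster = FAT[cluster]  # forward-only step through the chain
        pos += 1
    return cluster
```

    Without bookmarks, every backwards seek pays for the full chain walk; with a bookmark at cluster-index 2, a seek to index 3 takes one step instead of three.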


    I'm currently closing a business but in the next week or two I should have more time to devote to tidying up the cores and their threads.

    That's very weird, since none of the changes should have impacted performance.

    This is definitely with exactly the same SD card and hard file?


    (It's normal for the OSD to be slow during hard disk activity, but if it's non-responsive for long periods then I suspect the problem's with seeking from one part of the hard file to another. I don't suppose you know what the cluster size is on your SD card's FAT filesystem? If you have Linux handy you can check by typing "dosfsck -v -n /dev/<whatever>". The smaller the cluster size the slower seeking will be.)

    Here's a beta version which should fix the hard disk image corruption problem.

    (I'm still seeing occasional ROM checksum errors on V2 hardware - no idea why yet - but if you see it, just re-loading the ROM should fix it.)


    For consistency with other cores, the config files are now accessed from a second menu page, to the right of the main menu.

    @robinsonb5 Do you think that the 15khz RGB cores would work with any of the various RGB to composite/svideo converters that have been produced?


    https://www.manualslib.com/man…tsc-Encoding-Iec-843.html

    I'd be very surprised if the Minimig core didn't work with it, since the original Minimig had a composite encoder chip on board. I'm not sure about the other cores - would have to try them and see, I guess.


    (Now the sync issue is more "on my radar" I may end up updating the other cores to produce "correct" composite syncs at some point.)

    Maybe I'm misunderstanding, but it sounds like you're saying that most cores output 240p sync instead of 480i sync? But 240p sync is still a valid composite sync signal that many composite monitors will sync to, right? It's not exactly what we want, but it seems like it might actually work well enough to get a readable picture on the screen.

    The confusion is that there's more to sync than just how many wires are involved.


    TV video standards specify one particular pattern of sync pulses during the vertical sync interval to indicate an even frame, and a different pattern to indicate an odd frame. Normal broadcast video (and the Amiga in interlace mode) alternates between these.


    A "correct" 240p (or 288p) signal would generate a valid sync pattern for either even or odd frames, and not alternate between them - the Amiga does this in non-interlaced modes. (The MInimig core also does this, but it may not be doing it correctly.)


    Most other cores aren't outputting either type of sync pattern - they're simply combining the H and V sync signals into a single pin, generally with an XOR. It's not "correct", but works in practice as you've seen. (The reason is that the cores are written to be used with the scandoubler - being able to use them with a 15KHz monitor is a bonus. In the vast majority of cases it works fine, but no guarantees, your mileage may vary.)


    I think what Jens is saying - correct me if I'm wrong - is that there's no scope for generating the "correct" composite sync patterns in the Chameleon core.
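    For reference, the "cheap" combined sync mentioned above really is just one gate - sketched in Python (signal levels shown as 0/1; real polarity depends on the core):

```python
# XOR-combined composite sync: H and V sync merged onto a single pin.
# This carries no even/odd serration pattern during vertical sync,
# which is why it isn't a "correct" broadcast-style csync - but most
# 15KHz monitors lock to it anyway.
def csync(hsync, vsync):
    return hsync ^ vsync
```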

    To my knowledge, just an XOR does not make a proper even/odd field designation.

    Indeed - that's my point - most of the cores in question (and the systems they imitate) don't make any attempt to interlace their video. If they do serration at all, they output either all-even or all-odd frames.

    (I just checked the code for the NES, Sega Master System and Turbografx16 cores, and I don't see any code for serration in any of them. Minimig does have them, but I'm not convinced that they're implemented correctly - as discussed further up the thread.)

    As turrican9 says, most of the cores I've worked on or ported support 15KHz output. With the Minimig core, once you've forced the screenmode with an F key at startup, the mode you've chosen will be saved when you save a config file.


    For the "DeMiSTified" cores (basically anything I've ported in the last couple of years except Rampage) then holding down the menu button / key for a couple of seconds will toggle the scandoubler. The few cores which support config files (off the top of my head, the MiSTery and BBC cores) will also save the scandoubler setting as part of the config.


    For future cores (if they don't need or support full config files) I intend to make them look for a "15Khz.cfg" file on the SD card, and disable the scandoubler if it's found.

    Funnily enough I have a Transcend 8GB card which looks exactly like that, and it's not reliable for me either. It's recognised but I have corruption problems with it. Maybe there's an unusually large number of counterfeits of that particular type?

    (I have a few other Transcend cards which are fine, including a 16GB card which contains my main Minimig installation.)

    That's what I thought, but I couldn't get the screenmode to change - I wasn't sure if $d040 was read-only, hence the question.

    If I type

    POKE 53502,42

    followed by

    POKE 53499,240

    then the VGA debugging overlay appears, as expected.


    Then typing

    POKE 53312,n

    seems to have no effect, no matter the value of n.


    The screenmode query registers all return 0, too, whatever value's poked to 53315.


    Configuration mode is clearly still active, since I can show and hide the VGA debugging overlay by poking 53499 - is there some other step I'm missing? (Maybe the VGA registers only work in Menu mode?)

    So you could probably filter out jitter by creating a counter for each of the four possible mouse sensor values (00, 01, 10, 11) that keeps track of how many times you've seen each particular state, and then you don't activate the mouse until all four counters reach a certain value. But I'm not an FPGA programmer, so I don't know quite how difficult that is to do, and just clicking the mouse to activate isn't a bad solution anyway.

    It'd certainly be possible to set up some kind of elaborate filtering for the mouse (and if I were going to do such a thing I'd probably move the switch into the heart of Minimig itself instead of the board-specific outer layers since Denise already has counters derived from the quadrature signal.) But like I said, the FPGA's full to bursting, so every logic element counts.


    (With FPGAs it's not as simple as "fits" / "doesn't fit" - the more spare space there is, the easier a time the toolchain has placing and routing the logic and achieving the necessary timing. The Minimig core has always been squirrelly timing-wise, and the FPGA being 99% full is *really* not helping. I've just done a build on V2 which uses 24,261 logic elements out of an available 24,624, and it fails timing by a mere 83ps on just one path - that's the kind of result that screams "Don't touch a thing!")
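    The counter-based filter suggested above could be sketched like this in software terms (a hypothetical Python illustration - the threshold and structure are invented for the example, and in logic this would cost four counters plus comparators):

```python
# Jitter filter for a quadrature mouse: count sightings of each of the
# four sensor states (00, 01, 10, 11) and only treat the device as a
# mouse once every state has been seen THRESHOLD times.  A parked,
# jittering mouse toggles between at most two states, so it never
# completes the full cycle and never triggers the switch.
THRESHOLD = 4

def make_filter():
    counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    def see(state):
        counts[state] += 1
        return all(c >= THRESHOLD for c in counts.values())
    return see
```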

    Guys - you're doing something that is known to lead down the wrong path: you're optimizing something without even knowing whether it needs optimization. Simplicity is king (not just on a hardware level, but in engineering in general), so I suggest first checking whether the Port 1/3 mouse swapping feature even needs optimization.

    Indeed - there's always a balance to be struck between finesse and simplicity, so for now I'll go with "any direction or fire button switches to joystick", and "either left or right mouse button switches to mouse."
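    A minimal sketch of that rule (hypothetical Python - the function name and the joystick-first tie-break are chosen arbitrarily for the example):

```python
# Port auto-switch rule: any joystick direction or fire event claims
# the shared port for the joystick; either mouse button claims it back
# for the mouse.  Mouse movement alone is deliberately ignored, since a
# parked mouse can jitter on its direction lines.
def next_owner(owner, joy_dir, joy_fire, mouse_lmb, mouse_rmb):
    if joy_dir or joy_fire:
        return "joystick"
    if mouse_lmb or mouse_rmb:
        return "mouse"
    return owner  # no event: ownership is unchanged
```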

    What you're describing appears to be what Alastair is about to implement. Or I might be mis-reading even more :-)

    More or less - the scheme I described involved clicking the mouse button or pressing the joystick fire button to make the actual switch happen, and Paul has pointed out that you might want to move the joystick before pressing the fire button if the first time you touch it is when a game's menu screen appears. Therefore it would be good if direction events could also trigger the switch to the joystick port.

    (3) If you move the mouse again and it detects "impossible" values on port 3 again, then it switches back to using the mouse on port 3.

    There will need to be a mouse click to switch back to the mouse on port 3, because a parked mouse can emit any combination of direction signals, and can also jitter - so there'd be a danger of it constantly switching back to mouse while you're trying to use the joystick. (I might have considered implementing some complex filtering to avoid this problem, just as a finesse point - but the FPGA is full to bursting, so I have to keep it simple!)