Indivision AGA Mk3 and 1080p HD televisions via HDMI

  • Hi All,

    I'm having limited success getting my new Indivision AGA Mk3 to work at 1080p on an HDMI tv, and I wondered if anyone else was having more luck? Apologies in advance for the absolute wall of text...


    I've tried it with a couple of different Sony HDTVs we have in the house; both seem stable at 1280x1024p60, but not at 1920x1080p60 or 1920x1080p50. I don't think it's the Amiga 1200, as it's stable at the lower resolution, but I guess it could be, if increasing the Mk3 output resolution makes the A1200 do more work somehow and get hot/draw more current? I did wonder if it might be the psu making the Mk3 unstable if it is drawing more current when working at higher resolutions, but it could just be that my timings are out, so I thought I'd ask on the forum and see if anyone else was trying the same things as me.


    I used the latest 1.3 tool and firmware from the iComp website to produce the results listed below:


    The Mk3 doesn't seem to read/calculate the EDID info 100% correctly, and I guess it could be the TVs giving out the wrong information. When the Mk3 checks the EDID info it adds only 3 modes to the list, but I think the TVs are capable of a lot more because I've seen them do it when connected to PCs, Raspberry Pis and game consoles. It looks (at least to my not-expert eyes) like the Mk3 is getting most of the information correct for the resolutions it lists, but not the back porch setting, and then because of that, the frequencies are wrong which results in no picture. Below is what the Indivision tool returns, I've highlighted the fields I think are incorrect.


    EDID 1920x1080@66 (Does not work, causes "unsupported signal, check your device output")

    Pixel Clock: 148500000

    Horizontal Timing:        Vertical Timing:
    Visible: 1920             Visible: 1080
    F. Porch: 88              F. Porch: 5
    Sync Len: 44              Sync Len: 5
    B. Porch: 0?              B. Porch: 0?
    Polarity: Pos             Polarity: Pos
    Freq KHz: 72.3684?        Freq Hz: 66.3930?


    EDID 1280x720@65 (Does not work, causes "unsupported signal, check your device output")

    Pixel Clock: 74250000

    Horizontal Timing:        Vertical Timing:
    Visible: 1280             Visible: 720
    F. Porch: 110             F. Porch: 5
    Sync Len: 40              Sync Len: 5
    B. Porch: 114?            B. Porch: 0?
    Polarity: Pos             Polarity: Pos
    Freq KHz: 48.0893?        Freq Hz: 65.8758?


    EDID 1280x1024@60 (Works fine.)

    Pixel Clock: 108000000

    Horizontal Timing:        Vertical Timing:
    Visible: 1280             Visible: 1024
    F. Porch: 48              F. Porch: 1
    Sync Len: 112             Sync Len: 3
    B. Porch: 248             B. Porch: 38
    Polarity: Pos             Polarity: Pos
    Freq KHz: 63.9810         Freq Hz: 60.0197
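    The relationship between those fields is easy to check: the horizontal frequency is the pixel clock divided by the total line length (visible + front porch + sync + back porch), and the vertical frequency is that divided by the total line count. Plugging in the values above reproduces the suspect numbers, so the zero back porch really is what throws the frequencies off. A quick sketch (Python used purely as a calculator here):

```python
def timing_freqs(pixel_clock_hz, h, v):
    """h and v are (visible, front_porch, sync_len, back_porch) tuples."""
    h_total = sum(h)   # total pixels per line, including blanking
    v_total = sum(v)   # total lines per frame, including blanking
    freq_khz = pixel_clock_hz / h_total / 1000.0    # horizontal frequency
    freq_hz = pixel_clock_hz / (h_total * v_total)  # vertical frequency
    return round(freq_khz, 4), round(freq_hz, 4)

# EDID 1920x1080 entry as reported (back porches of 0):
print(timing_freqs(148_500_000, (1920, 88, 44, 0), (1080, 5, 5, 0)))
# -> (72.3684, 66.393), matching the odd values above

# Same mode with the standard back porches of 148/35:
print(timing_freqs(148_500_000, (1920, 88, 44, 148), (1080, 5, 5, 35)))
# -> (67.5, 60.0)
```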


    Further to this, I tried changing the settings for 1080p to the following, which I think should work and which do produce a picture, just not a stable one...


    TEST 1920x1080@60 (Works but causes screen to blink out every 5-10 seconds for ~1 second)

    Pixel Clock: 148500000

    Horizontal Timing:        Vertical Timing:
    Visible: 1920             Visible: 1080
    F. Porch: 88              F. Porch: 5
    Sync Len: 44              Sync Len: 5
    B. Porch: 148             B. Porch: 35
    Polarity: Pos             Polarity: Pos
    Freq KHz: 67.5000         Freq Hz: 60.0000


    TEST 1920x1080@50 (Works but causes screen to blink out every 5-10 seconds for ~1 second)

    Pixel Clock: 148500000

    Horizontal Timing:        Vertical Timing:
    Visible: 1920             Visible: 1080
    F. Porch: 528             F. Porch: 5
    Sync Len: 44              Sync Len: 5
    B. Porch: 148             B. Porch: 35
    Polarity: Pos             Polarity: Pos
    Freq KHz: 56.2500         Freq Hz: 50.0000


    I believe both of these settings should work fine on the TVs because they are the exact same settings I can see my PC's graphics card sending to the same TVs when I set it to those modes. I just can't figure out why it's not stable when I get the Mk3 to do it. If anyone has any ideas what I'm doing wrong, please let me know.


    The 1024p mode is totally rock-solid and works great, but in order to get the pixels 1:1 I have to set the Mk3 to scale the screen to 640x512, which means I lose the edges of the picture in any games (Battle Squadron, Rainbow Islands etc.) where the Amiga's screen is not left-aligned. If I could get either 1080p mode to be stable, I could scale the screen 1:1 with some extra pixels in the margin to help fix that.

    Also, I am wondering if using the 50Hz modes on the TVs would help with v-sync and screen tearing? The picture seems tonnes better when the TVs are in game mode and the Mk3 has a clean ratio between input and output resolutions.

    I've also tried to set up some 576p modes but no luck so far: I get a black screen but no "unsupported signal, check your device output" error message from the TVs, so something is different again with those modes. My reason for trying those is the lower pixel clock and integer scaling for games.
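    For what it's worth, the margin idea boils down to simple integer arithmetic: take the largest whole-number scale factor that fits the sampled Amiga screen into the output mode, and split the leftover pixels between the borders. A sketch under that (assumed) scheme shows why 640x512 fits 1280x1024 exactly at 2x, while 1080p would leave side and top/bottom borders to absorb non-left-aligned game screens:

```python
def integer_fit(src_w, src_h, out_w, out_h):
    """Largest integer scale that fits, plus the leftover border on each side."""
    scale = min(out_w // src_w, out_h // src_h)
    margin_x = (out_w - src_w * scale) // 2
    margin_y = (out_h - src_h * scale) // 2
    return scale, margin_x, margin_y

print(integer_fit(640, 512, 1280, 1024))  # -> (2, 0, 0): pixel-perfect, no margin
print(integer_fit(640, 512, 1920, 1080))  # -> (2, 320, 28): 1:1 pixels with borders
```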

    Again, apologies for the wall of text, I just wanted to write down as much as I could in case it was useful to anyone trying to figure out the same things.

  • if increasing the Mk3 output resolution makes the A1200 do more work somehow and get hot/draw more current?

    That surely happens, so it would be a good thing to know what else is in your system, and what kind of power supply you are using.


    The back porch value not being correct is a good hint - that's something for Timm to check, as (to my knowledge) there is no plausibility check for the data that the monitor sends. We just assume that the monitor knows best what it's capable of showing. That said, we only look at the very first EDID block that is "most compatible" and supported by all monitors. We're aware that there are further blocks in later EDID revisions, so if I had to speculate, it may be that your monitor supports a later EDID version, providing more information, but maybe the data in the first "compatibility block" is not maintained well.
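    As a side note on where that back porch value comes from: in the first EDID block's 18-byte detailed timing descriptors, the back porch is not stored at all; only active, blanking, front porch and sync width are, so the back porch has to be derived as blanking minus front porch minus sync. A slip in that derivation would show up exactly as the zero back porch reported above. A rough parsing sketch (the example bytes are the standard CTA-861 1080p60 descriptor, not a dump from these TVs):

```python
def parse_dtd(d):
    """Decode the timing fields of an EDID 1.3 detailed timing descriptor."""
    pixel_clock = int.from_bytes(d[0:2], "little") * 10_000  # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    h_front  = d[8] | ((d[11] & 0xC0) << 2)
    h_sync   = d[9] | ((d[11] & 0x30) << 4)
    v_front  = (d[10] >> 4)   | ((d[11] & 0x0C) << 2)
    v_sync   = (d[10] & 0x0F) | ((d[11] & 0x03) << 4)
    # Back porch is derived, not stored -- the step that can silently go wrong:
    return {
        "pixel_clock": pixel_clock,
        "h": (h_active, h_front, h_sync, h_blank - h_front - h_sync),
        "v": (v_active, v_front, v_sync, v_blank - v_front - v_sync),
    }

# CTA-861 1080p60 descriptor (first 12 of the 18 bytes):
dtd = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40, 0x58, 0x2C, 0x45, 0x00])
print(parse_dtd(dtd))
# -> pixel_clock 148500000, h (1920, 88, 44, 148), v (1080, 4, 5, 36)
```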


    One thing you might want to check is setting the output to DVI - we have another support case where a pretty modern Dell monitor does not like audio in the HDMI stream, and since an "intermittent picture" might as well point to a problematic blanking signal (where audio is transferred), I'd just like to see if you have a stable picture by switching to DVI-only in the "advanced" settings of the config tool.

  • My A1200 has an ACA-1221lc, a Goex floppy emulator and a CF2IDE+ in it - my Rapid Road is back with you at the mo for repairs, so that's not connected. My PSU is a Pico-PSU based one but external via the power connector. I can check the voltages and try unplugging things later today, after I finish work, if it's helpful?


    I've just tried flipping into DVI mode in 1080p60, and it seems to help on the Workbench screen but not when playing a game - Battle Squadron still exhibits the blinking, and also some intermittent random green horizontal lines. The green lines are much finer than the Amiga's resolution - I think they're at the actual 1080p resolution of the panel, so it must be something on the output side doing it. When I drop back to Workbench (640x256@50) it seems fine again.

    Still in DVI mode, I tried Turrican 2 next, and it seems way more stable than Battle Squadron. Only 1 blink so far, when starting the game: on the first jump, when the screen scrolls. Back to Workbench and stable again. Back to Turrican 2, start game, jump - and the screen blinks in the same way. Tried this again another 4-5 times and it's repeatable. Power cycle the Amiga, and it's still repeatable. There might be two separate things here: the blink is definitely repeatable, while the green lines seem a little random. Most of the time I can just occasionally see a couple of single pixels flickering on the Turrican 2 title screen - on the character and the power-up block - but occasionally there are a few line glitches that start part way along the screen and last until the end of that line; they last maybe 0.5 seconds. Again, these green lines are at the panel's pixel resolution, not the Amiga's.


    Then back to Battle Squadron and it starts blinking and glitching straight away but not on the Workbench screen.


    Might some of this be related to the copper/sprites and some crazily tight code timings in Battle Squadron multiplexing the heck out of things, while in Turrican 2 there's either less going on, or the timings are less critical?


    When I switch back to 1024p, all of these problems go away regardless of whether I'm in DVI mode or not.


    I hope some of this is useful, please let me know if you need me to try anything else.

  • The Mk3 doesn't seem to read/calculate the EDID info 100% correctly, and I guess it could be the TVs giving out the wrong information. When the Mk3 checks the EDID info it adds only 3 modes to the list, but I think the TVs are capable of a lot more because I've seen them do it when connected to PCs, Raspberry Pis and game consoles. It looks (at least to my not-expert eyes) like the Mk3 is getting most of the information correct for the resolutions it lists, but not the back porch setting, and then because of that, the frequencies are wrong which results in no picture. Below is what the Indivision tool returns, I've highlighted the fields I think are incorrect.

    You are correct, I can confirm a bug in the EDID mode evaluation. Thank you for your clear and useful presentation.

    It is also correct that there are more modes that could be added by evaluating further EDID info from some devices, which is where some of the more interesting "TV modes" are located. First we need to get this bug corrected via an update to the config tool.

  • You are the second person who is reporting "line glitches", and what you have in common with the other person is that you have a Gotek floppy emulator in your computer. Could you please try to remove that and see if the "lines" are gone? Also, please indicate the type of power supply you are using.


    Might some of this be related to the copper/sprites and some crazily tight code timings in Battle Squadron multiplexing the heck out of things, while in Turrican 2 there's either less going on, or the timings are less critical?

    While these games really push the chipset to its limits, I suspect a different cause, and that may be that the software is hitting registers with the CPU that are not meant for CPU access at all. As a workaround, you could try to "relax" the number of lines that you can enter in the config tool: PAL has a minimum of 312 and a maximum of 313 lines - you could change the minimum to 310 and the maximum to 315 (don't forget to click on "save/apply"), which will allow a few CPU accesses to the strobe registers without the mode being "lost" for a single frame.
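    If it helps to picture the workaround: the flicker fixer effectively counts the lines in each incoming frame and only treats the frame as valid if the count falls inside the configured window, so a stray strobe-register access that makes a frame look a line short or long drops that frame. A toy model of the idea (the line counts are made up for illustration):

```python
def frames_accepted(line_counts, min_lines, max_lines):
    """Count frames whose apparent line count falls inside the window."""
    return sum(min_lines <= n <= max_lines for n in line_counts)

# Hypothetical PAL stream where stray CPU strobe accesses distort two frames:
counts = [313, 312, 311, 313, 316, 312]

print(frames_accepted(counts, 312, 313))  # strict PAL window: 4 of 6 frames kept
print(frames_accepted(counts, 310, 315))  # relaxed 310-315: 5 of 6 frames kept
```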

    I tried disconnecting the Gotek/Goex completely, both power and data, but it did not seem to change the behaviour in any way. My power supply is based on an 80W Pico-PSU (not a fake one), connected via about 25-30cm of 5-core 1mm2 cable and a square DIN plug to the A1200's power connector. I've got an iComp PSU which I pre-ordered in transit to me at the moment (thank you for that btw), so I can try that too once it arrives.

    I also tried changing the min & max lines to 310 and 315, and this too did not seem to change the behaviour in any way, whether in DVI mode or not.


    Is there anything else that might be worth a shot to help figure out what's going on?

  • My power supply is based on an 80W Pico-PSU (not a fake one), connected via about 25-30cm of 5-core 1mm2 cable and a square DIN plug to the A1200's power connector.

    That's probably the second-best you can have. Still, the drop in the input filter is not taken care of, so you might need to crank up the voltage at the source a bit to achieve 5.0V inside the computer.
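    To put a rough number on the cable itself: copper is about 0.0172 ohm·mm²/m, so 30 cm of 1 mm² wire, out and back, drops only a few tens of millivolts even at a few amps; the larger losses are in the connectors and the computer's input filter. A back-of-envelope sketch (the 3 A load is an assumption, not a measurement):

```python
RHO_COPPER = 0.0172  # ohm * mm^2 / m, resistivity of copper at room temperature

def cable_drop_v(length_m, area_mm2, current_a):
    """Voltage lost over the supply and return conductors of a DC cable."""
    r_one_way = RHO_COPPER * length_m / area_mm2
    return 2 * r_one_way * current_a  # both conductors carry the full current

print(round(cable_drop_v(0.3, 1.0, 3.0), 3))  # -> 0.031 V for the cable alone
```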


    Since you are obviously soldering yourself, you might want to double-check if there are cold solder joints on your Lisa - although not very likely to be the cause of software-triggered glitches, there is a remote chance that a loose RGA line makes the flicker fixer believe that one of the strobe registers is accessed, where in reality the CPU accessed a different register. Here's another customer having trouble, seemingly resolved by reflowing all Lisa pins.

  • The new 1.4 config tool seems to be picking up the EDID info correctly now. Thanks very much! :)


    Next job: Try and re-flow, re-clean and re-check the Lisa pins (a little bit scary but should be okay...) to see if that improves the mysterious green lines and display blinking in and out...

    I think I've solved the riddle of the corrupted graphics and blinking screen... and would you believe it was the darned HDMI cable? I tried everything: I re-cleaned the pins on Lisa, I reflowed the joints, I reseated and rechecked the Indivision Mk3 and nothing changed... and then I moved it to a different TV, swapped the HDMI cable and bingo: the problem was gone. I'm not sure if it was a bad conductor in the cable or just crosstalk, but it's definitely the cable, and it only manifests at high bit rates and is worse with DVI disabled (does DVI over HDMI have a lower bit rate or more error correction?). I still have no idea why jumping in Turrican 2 made the screen blink, but it was definitely the cable doing it.

    Note for anyone else reading this who is having problems with image stability, even if it's repeatable and seems like it's software triggered... Check any cables, extensions, and connectors before you start yanking your Amiga apart. :)

  • Oh I believe you, sometimes it is the simple things! I'm glad you found the problem.
    HDMI does send more data during the blanking period compared with DVI. So yes it could result in different behavior.

  • Hi All,


    I thought I'd give an update in case the information is useful, and I have a couple of questions too so I can try and understand how things are being done, and hopefully then give better information and feedback - please don't think this message is in any way an "I want a pony! Now!" demand, and as with the previous post, apologies in advance for the wall of text…


    First off, having changed out the HDMI cable and using the Mk3 for the last couple of days, the display now seems totally stable with no picture dropouts, weird green lines, or blue fuzz in the picture when switching the resolution up to 1080p - so I'm still pretty certain it was just the cable causing the issues at higher bit rates. The picture quality seems perfect in both DVI and non-DVI mode. Apologies for repeating myself but I would honestly say to anyone having similar problems, *check your cables* and don't assume if they work at one resolution, then they're fine and not at fault if you get problems at other resolutions.


    The EDID info has also been continuously and correctly detected by the 1.4 config tool over the last few days on all of the HDTVs I've tried. As was mentioned previously, I think only a few resolutions are being read/listed by the tool, but all resolutions listed work well and produce a stable picture on each of the (now 4) different HDTVs I've managed to test. As far as I can tell, the resolutions detected seem to be from the DMT monitor compatibility list, and not from the CEA HDTV ones. Is this what you'd expect?


    When I switch between DVI and non-DVI mode, the amount of overscan visible on the TV display changes. DVI mode seems to display the maximum viewable area the Amiga is capable of putting a screen into, but in non-DVI mode, the displayed area seems to zoom in slightly losing some of the outside screen margins, and also part of the Mk3 OSD information (Please see attached pictures). In some resolutions this means that games have part of their screen display moved outside the viewable area. There are no obvious ripples in the pixel sizes that I can see, which would indicate a change in the sampling of the Amiga's display in either DVI or non-DVI mode. I can't tell if fewer lines are being sent from the Mk3 when DVI is off, or if all the TVs are just processing the incoming signal differently, but all of the TVs display the same effect in the same way. My testable TV sample size is not at all big enough, but the TVs are different models from different years: 3 are Sony, 1 is from Toshiba, 3 have 1080p panels, and 1 is 720p. It's not impossible that they all have the same chipset and firmware and therefore process the signals 100% identically, but I think it is quite unlikely. The internal scalers on all the TVs must be scaling and smoothing things, as I don't see any pixel shimmer when scrolling, but I think there should be some, as the zooming must have changed at least the vertical pixel height.


    At the moment, I get no audio at all from any of the TVs, but all are displaying the small speaker icons in their on-screen info, which indicates they think audio is being transmitted/received. None of the TVs shows any indication of an error or HDCP protection, I just don't hear anything at all. I *think* DMT modes are not guaranteed to support audio, so if the EDID list is only listing DMT modes, might this be a reason for audio not working for some folks? If not this, then might it be something to do with the audio bit-rate not being what the TVs are expecting, or is it just likely that evil HDCP is kicking in?


    In case extra info helps any, I have several Raspberry Pis I tinker with that have the same issues with DMT modes and audio (they have a config setting to try to get around the issue). Also, from reading around the subject I don't think RPis do HDCP at all, but their audio always seems to work okay for me in the CEA modes. In addition, I've got a couple of Carby (open source GCVideo) FPGA devices which do a similar job to the Mk3, scan-doubling and converting GameCube digital video to work on modern TVs. I believe these output video in DVI mode all the time and use a similar hack to the RPi's to get audio to work in that mode (which it does on all the TVs I have). The comments I read whilst browsing the GCVideo source code seemed to suggest audio bit-rate was critical to it working, but I don't know if any of this is relevant or applicable to the encoder chip on the Mk3.


    At present enabling VSync (with the VSync line set to default value 100) or Auto Resolution causes the picture to disappear completely on all of the TVs I have available to test. The TVs report that the signal goes out of range as soon as either are enabled. Changing the Vsync line value does not seem to change this behaviour.


    Lastly, when I put the Mk3 into 1080p 50Hz and play a game with scrolling, I can see a screen tear ripple about every 10-12 seconds, and I can also see on the Mk3 OSD that it thinks the Amiga’s screen refresh rate is fluctuating between 50-51Hz. I don’t know if this is normal, so I thought I’d let you know in case it’s relevant to any of the above.


    As I said at the start, sorry again for the wall of text, and none of this message is intended as a complaint. I am very, very happy with my Mk3. The picture is great - way, way better than the analogue output, and I am not noticing any additional lag on the screen, so I think whatever there is must be fairly small. 90% of my problems were related to my dodgy HDMI cable, and the other 10% were a config tool issue that was fixed in less than 4 hours on the day I reported it - you can't ask for better service than that.

    I hope some of this info is useful to others.


    Cheers! :)

  • I also have the issue with part of the screen being lost outside the viewable area, but only on some games... very strange, and with DVI on. Maybe I'll try and see what happens with it off.

  • At present enabling VSync (with the VSync line set to default value 100) or Auto Resolution causes the picture to disappear completely on all of the TVs I have available to test. The TVs report that the signal goes out of range as soon as either are enabled. Changing the Vsync line value does not seem to change this behaviour.

    The vsync line can be used to move away a bit of tearing if it would occur at the bottom or top edge of the screen. Changing this value won't help if vsync doesn't display anything at all. Did you try auto-resolution in super-hires mode? On my TV here, auto-resolution only gives a picture if I set the mode to sample at super-hires (even for hires or lowres pictures). The 28.x MHz pixel clock is too low for the TV to accept, but at super-hires, auto-resolution outputs a 56 MHz pixel clock and the TV is fine with that.

    Lastly, when I put the Mk3 into 1080p 50Hz and play a game with scrolling, I can see a screen tear ripple about every 10-12 seconds, and I can also see on the Mk3 OSD that it thinks the Amiga’s screen refresh rate is fluctuating between 50-51Hz. I don’t know if this is normal, so I thought I’d let you know in case it’s relevant to any of the above.

    Yes, that 50/51 Hz is normal: it counts the number of frames it sees in 1 second for display. It is for user information only; the value isn't used to make any decisions. A more accurate way to measure these low frequencies would be to determine the time duration of a frame and take the reciprocal (1/x). However, dividing is kinda hard to do in hardware, so just accept the +/-1 error please ;-)
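    The ±1 is just the usual quantisation from counting whole events in a fixed gate: depending on where the 1-second window happens to start relative to the frames, a source slightly off 50 Hz is counted as 50 in some windows and 51 in others. A small simulation of such a counter (the frequency and phases are illustrative, not measured):

```python
def frames_in_window(freq_hz, phase_s, window_s=1.0):
    """Count vsync events landing inside a fixed measurement window."""
    period = 1.0 / freq_hz
    n, t = 0, phase_s
    while t < window_s:
        n += 1
        t += period
    return n

# A hypothetical 50.5 Hz source: the displayed value flips with the gate phase.
print(frames_in_window(50.5, 0.0))    # -> 51
print(frames_in_window(50.5, 0.015))  # -> 50
```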

  • When I switch between DVI and non-DVI mode, the amount of overscan visible on the TV display changes. DVI mode seems to display the maximum viewable area the Amiga is capable of putting a screen into, but in non-DVI mode, the displayed area seems to zoom in slightly losing some of the outside screen margins, and also part of the Mk3 OSD information

    Hi, nuttie.

    That's exactly the same problem that I'm having, check my first post on the issue and the messages that follow that one.

    Regarding sound, deactivating "DVI mode" in order to recover audio over HDMI did not work as expected; I get an odd behaviour: playing an MP3 that worked before trying "DVI mode", I got only 1-2 seconds of sound. Pausing and unpausing, 2 more seconds, and so on.


    Saluditos,


    Ferrán.

  • As far as I can tell, the resolutions detected seem to be from the DMT monitor compatibility list, and not from the CEA HDTV ones. Is this what you'd expect?

    At this point, we expect only a small subset of what the monitor can do, as only the highest-compatibility level of information is processed by the config tool. We are first implementing the basic functionality, then working our way up to "luxury functions".


    When I switch between DVI and non-DVI mode, the amount of overscan visible on the TV display changes. DVI mode seems to display the maximum viewable area the Amiga is capable of putting a screen into, but in non-DVI mode, the displayed area seems to zoom in slightly losing some of the outside screen margins, and also part of the Mk3 OSD information (Please see attached pictures).

    I haven't been able to reproduce that here, so I assume both you and Ferrán have the same problem with a monitor setting, or even something that the monitor decides on its own without the possibility for the user to change the behaviour. I guess for DVI, it really does "1:1" pixel display, and for HDMI (which is more of a multimedia signal, less a computer-associated one), it switches to "movie mode" or similar. Many TVs have that setting deeply hidden in a menu; that picture processing is the first thing I turn off when I get a new screen.


    Rest assured that the number of lines and pixels sent to the screen is identical in both DVI and non-DVI modes. It is only a single bit that is set/cleared in the encoder chip. All the other decisions are made by your screen(s), so please dig in the menus of those screens. I know that Sony is pretty bad, and might even have the setting "per input device", resulting in the setting not being remembered when you switch between DVI/non-DVI. Since we don't support CEC (that's on the nice-to-have list), the TV might not even store the info, as the source device is not recognised with an ID at all.

    If not this, then might it be something to do with the audio bit-rate not being what the TVs are expecting, or is it just likely that evil HDCP is kicking in?

    We do not activate HDCP, so the audio sampling frequency is the best candidate.

    Lastly, when I put the Mk3 into 1080p 50Hz and play a game with scrolling, I can see a screen tear ripple about every 10-12 seconds,

    This is another indicator that your screens are processing the picture, and don't just display 1:1 in both the size and time domains. Vertically-synced output modes (meaning that the output picture is in sync with the computer's vertical sync) give the lowest latency you can get, and any processing in the TV will mess up this property.


    As for animals, Peter is right, Chameleon is the only one we currently offer. We used to have a Catweasel, but that serves a *slightly* different purpose :-)

  • I haven't been able to reproduce that here, so I assume both you and Ferrán have the same problem with a monitor setting, or even something that the monitor decides on its own without the possibility for the user to change the behaviour. I guess for DVI, it really does "1:1" pixel display, and for HDMI (which is more of a multimedia signal, less a computer-associated one), it switches to "movie mode" or similar. Many TVs have that setting deeply hidden in a menu; that picture processing is the first thing I turn off when I get a new screen.

    Hi Jens.

    A friend pointed out to me that this odd overscan/oversizing problem could be related to aspect ratio. My Dell U2410, on which I'm having it, is 16:10. I have tried with a monitor of the same family, a Dell U2711 (same electronics AFAIK, but a different aspect ratio, 16:9), and the image is correctly displayed. Could that be it? If so, is there any way to solve it without having to resort to "DVI mode", losing sound in the process? Thanks.

    Saluditos,


    Ferrán.