Remove 15-bit video for OS X. This mode doesn't seem to work. #38
Conversation
This also fixes 2-bit, 4-bit, and 8-bit modes.
#ifdef __APPLE__
// 15-bit color does not seem to work on OS X
int *last = std::remove(avail_depths, avail_depths + num_depths, 15);
num_depths = ( (size_t)last - (size_t)avail_depths ) / sizeof(int);
You can just do:
num_depths = last - avail_depths;
C++'s pointer subtraction semantics will do the right thing.
15-bit depth seems like a very weird thing to support; I wouldn't be surprised if SS simply doesn't work with it and XListDepths() simply doesn't return it on other platforms. I think we can just remove it from the list on all platforms (i.e. without the ifdef).
Also, might as well do it before the std::sort() call above.
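Putting those suggestions together, a minimal sketch of the resulting shape (the filter_depths wrapper is hypothetical; avail_depths and num_depths are the variables from the diff above):

#include <algorithm>

// Drop the 15-bit entry on all platforms, before sorting; std::remove
// shifts the surviving entries forward and returns a pointer one past
// the last kept element.
static void filter_depths(int *avail_depths, int &num_depths)
{
    int *last = std::remove(avail_depths, avail_depths + num_depths, 15);
    num_depths = last - avail_depths;  // element count, not a byte count
    std::sort(avail_depths, avail_depths + num_depths);
}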
Removing 15-bit mode has the effect of removing the "thousands" color option in OS X. I'm concerned that if 15-bit mode is removed on all platforms, the "thousands" option will no longer be available on other platforms either. However, the "thousands" option is not that useful if you have "millions".
Shouldn't there still be a 16-bit mode that allows thousands of colors?
Looking up what 15-bit depth means, I guess it's there to signify a 555 RGB encoding (rather than 565 with 16-bit).
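(For reference, a minimal illustration of the two packings, assuming 8-bit channel inputs; this is not code from the repo:)

#include <cstdint>

// 15-bit "555": 5 bits per channel, top bit unused.
static inline uint16_t pack_rgb555(uint8_t r, uint8_t g, uint8_t b)
{
    return ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3);
}

// 16-bit "565": green gets the extra bit.
static inline uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3);
}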
Can you provide details of what exactly happens without this? I'm curious what's actually failing.
The 16-bit color mode does not appear to function properly in OS X. It causes the image to be distorted: the image is compressed horizontally, so the right half of the screen is blank; the colors are wrong (there is a purple tint); and there are horizontal lines. In addition, when 16-bit color mode is enabled, the same happens in the 8, 4, and 2-bit modes. When I disable 16-bit color mode on OS X, I lose 16-bit color, but I gain working 8, 4, and 2-bit color modes.
15-bit color mode is now removed on all platforms, and the removal now happens before the sort. Also updated a variable name and changed the list-length calculation to use C++'s clever pointer subtraction.
Maybe I do not understand what this is about, but I never noticed a problem with 16-bit color in BasiliskII/SheepShaver in OSX, nor with 8, 4, and 2 bit color. Ronald P. Regensburg.
Are you running the x-windows version of Basilisk (as opposed to the "native" OS X version)? Which OS version are you running? I am running 10.8.3. Also, I am using XQuartz, so perhaps it is an XQuartz bug.
We have had the screen turning green in the past with Xwindows. I seem to remember one of the build parameters might solve this issue. Perhaps it was --disable-vosf.
I always ran the native OSX version, many different builds since 1996, in OSX 10.4 on PPC and in all OSX versions from OSX 10.4 through (now) 10.8.3 on Intel.
This "fix" only affects the x-windows build. I am no longer able to build the native OSX version on 10.8, although I can run a native version on 10.8 that was built with a previous version of OSX. From: RonaldPR [email protected] I always ran the native OSX version, many different builds since 1996, in OSX 10.4 on PPC and in all OSX versions from OSX 10.4 through (now) 10.8.3 on Intel. |
I'm still a little surprised that just having this mode in the list breaks the other modes. Can you do a little debugging to understand why that happens? Without the change, the entry is simply there in the list, so it must be something that SheepShaver/Basilisk does with that mode that affects the other modes. Can you determine which call SheepShaver makes that results in the problems?
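(As a starting point, a hypothetical debugging aid along these lines would show what the X server actually advertises and whether 15 appears in the list; dump_depths is not repo code:)

#include <cstdio>
#include <X11/Xlib.h>

// Print every depth the X server reports for the default screen.
static void dump_depths(Display *dpy)
{
    int num_depths = 0;
    int *depths = XListDepths(dpy, DefaultScreen(dpy), &num_depths);
    if (depths != NULL) {
        for (int i = 0; i < num_depths; i++)
            fprintf(stderr, "avail depth: %d\n", depths[i]);
        XFree(depths);
    }
}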
Also, I just tried it on my 10.6.8 machine and 15-bit mode works fine in an X build. I also confirmed that your change does result in the thousands of colors option being removed. So either way, this doesn't seem like a correct solution. If we do have to remove it, it would have to be based on the version of the OS it's running on. But hopefully, we can find a better fix.
The issue could be related to XQuartz rather than to the OSX version.
I believe the x-windows server reports that it supports 15-bit and 32-bit color modes. There are conversion functions that convert 2, 4, and 8-bit color modes to 15-bit and 32-bit color modes. When 15-bit color is available, the other color modes get mapped to 15-bit color mode. My initial fix was just to change the order of the search, so 2, 4, and 8-bit color modes got converted to 32-bit color mode instead of to 15-bit color mode. This is probably more efficient anyway since most people are probably natively running 32-bit color mode.
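(A hypothetical sketch of that search-order idea, not the actual video_x.cpp logic: when choosing a conversion target for a 2/4/8-bit mode, prefer the deepest depth the server offers, so 32-bit wins over 15-bit whenever both are available. pick_target_depth is an invented name:)

#include <algorithm>

// Pick the deepest advertised depth as the conversion target;
// returns 0 if the list is empty.
static int pick_target_depth(const int *avail_depths, int num_depths)
{
    const int *end = avail_depths + num_depths;
    const int *best = std::max_element(avail_depths, end);
    return (best != end) ? *best : 0;
}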