
Add multiple colorspaces to template comparison #113

Open
choosehappy opened this issue May 21, 2018 · 4 comments
@choosehappy
Owner


Author Name: Andrew Janowczyk (@choosehappy)
Original Redmine Issue: 197, http://hawking.case.edu:3000/issues/197
Original Date: 2018-05-02


Not as easy as it seems: we need to know each color space's min/max values, and in the case of YUV they're not symmetric: Y is in [0, 1] while U and V are in roughly [-0.5, 0.5].
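
Something like a per-colorspace lookup table would be needed. Rough sketch only: the RGB bounds assume 8-bit input, the YUV bounds follow the approximation above, and the HSV bounds assume a [0, 1]-per-channel convention, so the exact extremes depend on the conversion used:

CHANNEL_RANGES = {
    # colorspace -> one (min, max) pair per channel; values are approximate
    "RGB": [(0, 255), (0, 255), (0, 255)],
    "YUV": [(0, 1), (-0.5, 0.5), (-0.5, 0.5)],
    "HSV": [(0, 1), (0, 1), (0, 1)],
}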

@nanli-emory
Collaborator

Hi @choosehappy, I can't open the issue link http://hawking.case.edu:3000/issues/197 and need some help with it.

Could you provide me with more information about this one?

@choosehappy
Owner Author

Not as informative as you'd hope :-\ Here is a screenshot:

[screenshot of the original Redmine issue]

But I can sketch it out quickly. If you look here:

def compareToTemplates(s, params):

you'll see we always do the comparison in RGB space.

A hopefully "simple" feature is to add the ability to perform this computation in a non-RGB colorspace, e.g., YUV, HSV, etc., as specified by the user.

This is the relevant code from the Redmine instance:

# Imports assumed by this snippet; global_holder is module-level state that
# caches the template histograms between calls.
import logging
import os

import numpy as np
from distutils.util import strtobool
from skimage import io
from skimage.color import convert_colorspace


def computeHistogram(img, bins, mask=-1):
    # per-channel normalized histogram; note the hard-coded [0, 255] range,
    # which only makes sense for 8-bit RGB input
    result = np.zeros(shape=(bins, 3))
    for chan in range(0, 3):
        vals = img[:, :, chan].flatten()
        if isinstance(mask, np.ndarray):
            vals = vals[mask.flatten()]
        # normed= was removed from np.histogram in newer NumPy; density= is the replacement
        result[:, chan] = np.histogram(vals, bins=bins, density=True, range=[0, 255])[0]
    return result


def compareToTemplates(s, params):
    logging.info(f"{s['filename']} - \tcompareToTemplates")
    bins = int(params.get("bins", 20))
    limit_to_mask = strtobool(params.get("limit_to_mask", "True"))  # strtobool expects a string
    to_color_space = params.get("to_color_space", "RGB")

    img = s.getImgThumb(s["image_work_size"])

    # convert the thumbnail from RGB into the user-specified colorspace
    img = convert_colorspace(img, "RGB", to_color_space)

    if limit_to_mask:
        imghst = computeHistogram(img, bins, s["img_mask_use"])
    else:
        imghst = computeHistogram(img, bins)

    for template in params["templates"].splitlines():
        template_key = os.path.splitext(os.path.basename(template))[0] + "_" + to_color_space
        if template_key not in global_holder["templates"].keys():
            img_tmp = io.imread(template)
            img_tmp = convert_colorspace(img_tmp, "RGB", to_color_space)
            global_holder["templates"][template_key] = computeHistogram(img_tmp, bins)
        # squared difference ("MSE") between the image histogram and the cached template histogram
        val = np.sum(pow(abs(global_holder["templates"][template_key] - imghst), 2))
        s.addToPrintList(template_key + "_MSE_hist", str(val))
    return

The question is how to figure out the right bins. For RGB we know the min/max values and can always split the [0, 255] range into even bins; for YUV I don't know what the extreme values are, so it's not clear how to normalize the bins.

Do you see what I mean?
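
One option (just a sketch, not anything committed) would be to pass explicit per-channel ranges into computeHistogram, so the bin edges are fixed by the colorspace rather than computed from the image:

def computeHistogram(img, bins, mask=-1, ranges=((0, 255),) * 3):
    # ranges: one (min, max) pair per channel; the default matches 8-bit RGB
    result = np.zeros(shape=(bins, 3))
    for chan in range(0, 3):
        vals = img[:, :, chan].flatten()
        if isinstance(mask, np.ndarray):
            vals = vals[mask.flatten()]
        # a fixed, colorspace-dependent range keeps the bin edges identical across images
        result[:, chan] = np.histogram(vals, bins=bins, density=True, range=ranges[chan])[0]
    return result

and then look the ranges up from a per-colorspace table like the one sketched in the issue description.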

@nanli-emory
Collaborator

Hi @choosehappy, since we use the image thumbnail, which shouldn't be huge in pixels, I could iterate over each channel and find each channel's range.
Do you think that works?
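
Roughly what I have in mind (sketch only, the helper name is a placeholder):

def perImageRanges(img):
    # derive each channel's (min, max) from the thumbnail itself
    return [(img[:, :, chan].min(), img[:, :, chan].max()) for chan in range(3)]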

@choosehappy
Owner Author

I don't think this will work because you will end up with different values per image, and thus the end results won't be comparable between images?
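
A toy example of what I mean (made-up numbers, not from real slides): two images drawn from the same distribution, one with a couple of outlier pixels, end up with different bin edges when the range is taken per image, so the histogram MSE gets inflated, whereas a fixed range keeps the histograms aligned:

import numpy as np

rng = np.random.default_rng(0)
vals_a = rng.normal(0.5, 0.1, 10000)                           # same underlying distribution
vals_b = np.append(rng.normal(0.5, 0.1, 10000), [0.0, 1.0])    # plus two outlier pixels

# per-image ranges -> different bin edges, so bin k of A and bin k of B cover different intervals
h_a = np.histogram(vals_a, bins=20, density=True, range=(vals_a.min(), vals_a.max()))[0]
h_b = np.histogram(vals_b, bins=20, density=True, range=(vals_b.min(), vals_b.max()))[0]

# fixed range -> identical bin edges, so the histograms are directly comparable
f_a = np.histogram(vals_a, bins=20, density=True, range=(0, 1))[0]
f_b = np.histogram(vals_b, bins=20, density=True, range=(0, 1))[0]

print(np.sum((h_a - h_b) ** 2))  # inflated by the mismatched bin edges
print(np.sum((f_a - f_b) ** 2))  # small, since the underlying distributions nearly match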
