Looking for the proper algorithm to compress many low-res images of nearby locations

Data Science Asked by user1282931 on September 22, 2020

I have an optimization problem that I’m looking for the right algorithm to solve.

What I have: A large set of low-res 360 images that were taken on a regular grid within a certain area. Each of these images is quite sparsely sampled, and each has an accurate XYZ position assigned to its center. There are millions of these small images; clusters of close-by images obviously share a lot of information, while images farther apart can be completely different.

What I want to do is to compress these small 360 images.

If two 360 images are close to each other, one can be ‘warped’ into the other by projecting it onto a sphere of finite radius and then moving that sphere (so a close-by 360 image, warped that way, can be a good approximation of another 360 image).
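To make the warp concrete, here is a minimal sketch of that idea, assuming equirectangular 360 images and nearest-neighbour resampling (the function name `warp_equirect` and the exact parametrization are my own illustration, not from the question):

```python
import numpy as np

def warp_equirect(img, radius, delta):
    """Warp an equirectangular 360 image taken at one point so it
    approximates the view from a point offset by `delta`, assuming all
    scene geometry lies on a sphere of the given radius around the
    source position.  Sketch only: nearest-neighbour, no interpolation."""
    h, w = img.shape[:2]
    # pixel grid -> spherical angles at the target viewpoint
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi      # [-pi, pi)
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi      # (pi/2, -pi/2)
    lon, lat = np.meshgrid(lon, lat)
    # unit view directions from the target position
    d = np.stack([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)], axis=-1)
    # intersect ray (origin = delta, direction = d) with the sphere
    # |p| = radius: solve |delta + t d|^2 = radius^2, take positive root
    b = np.sum(d * delta, axis=-1)
    c = np.dot(delta, delta) - radius ** 2
    t = -b + np.sqrt(np.maximum(b * b - c, 0.0))
    p = delta + t[..., None] * d                 # 3D points on the sphere
    # re-express those points as angles seen from the source position
    src_lon = np.arctan2(p[..., 1], p[..., 0])
    src_lat = np.arcsin(np.clip(p[..., 2] / radius, -1.0, 1.0))
    # angles -> source pixel indices (nearest neighbour)
    x = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - src_lat) / np.pi * h).astype(int).clip(0, h - 1)
    return img[y, x]
```

With `delta = 0` this reduces to the identity; for small offsets (relative to the radius) it gives the approximate re-projection described above.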

Based on this idea, I want to compress these small low-res 360 images by replacing each of them with:

  • N (N being something like 2-5) indices into an archive of M (M being something like 50-500) different ‘prototype’ images (possibly of higher resolution than the low-res 360 images), each of which has an XYZ location assigned plus a radius
  • N blend weights

Such that if I want to reconstruct one of the small, sparsely sampled 360 images, I take the N indices stored for this image, look up the corresponding prototype images from the archive, warp each of them based on its radius and the delta vector between its XYZ location and the compressed image’s XYZ location, and then blend the N prototype images using the N blend weights (and possibly scale down if the prototype images are higher-res).
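The reconstruction step above can be sketched like this (the dict layout for `entry` and `archive` is a hypothetical illustration, and `warp` stands for any sphere-projection warp like the one described earlier):

```python
import numpy as np

def reconstruct(entry, archive, warp):
    """Rebuild one compressed 360 image.

    entry   : dict with 'indices' (N ints), 'weights' (N floats) and
              'xyz' (position of the compressed image) -- hypothetical layout
    archive : list of dicts with 'img', 'xyz' and 'radius'
    warp    : callable warp(img, radius, delta) implementing the
              sphere-projection warp described in the question
    """
    out = None
    for idx, wgt in zip(entry['indices'], entry['weights']):
        proto = archive[idx]
        delta = np.asarray(entry['xyz']) - np.asarray(proto['xyz'])
        warped = warp(proto['img'], proto['radius'], delta)
        out = wgt * warped if out is None else out + wgt * warped
    return out
```

So decompression is N archive look-ups, N warps, and one weighted sum per image.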

I guess this goes in the direction of eigenfaces, but with eigenfaces each compressed face stores a weight for every eigenface, whereas I want each compressed sphere to have only N non-zero weights.

So my input is:
a lot of small 360 images, plus an XYZ location for each

my output should be:

  • an archive of M "prototype" images, each assigned an XYZ location and a projection radius
  • all compressed spheres, with each sphere compressed to N indices and N weights

This seems to be some non-linear least-squares problem, but I wonder if someone can point me in the right direction on how to solve it?
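For what it’s worth, the linear core of this formulation (M prototypes, at most N non-zero weights per image, ignoring the warp) is exactly sparse dictionary learning. A minimal sketch on random stand-in data, using scikit-learn’s `DictionaryLearning` with OMP to enforce the N-sparsity (the warping would have to be folded in separately, which is what makes the full problem non-linear):

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
# stand-in for the real data: each row is one flattened low-res 360 image
X = rng.standard_normal((100, 64))

M, N = 8, 2  # archive size M and per-image sparsity N, as in the question
dl = DictionaryLearning(n_components=M,
                        transform_algorithm='omp',   # keep at most N atoms
                        transform_n_nonzero_coefs=N,
                        random_state=0)
codes = dl.fit_transform(X)      # shape (100, M), <= N non-zeros per row
prototypes = dl.components_      # shape (M, 64): the learned 'archive'
X_hat = codes @ prototypes       # linear reconstruction of all images
```

One way to attack the full problem might then be to alternate between this sparse-coding step (with prototypes warped toward each image’s location) and re-estimating the prototype images and radii, in the spirit of K-SVD-style alternating minimization.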

As a completely different approach I also looked into spherical harmonics, but with those I only get enough high-frequency detail at l=6, which takes 36 coefficients; that is too many, and also too slow to decompress.


