Discussion: [Lensfun-users] Lens calibration looking for help
Torsten Bronger
2016-07-18 21:39:00 UTC
Hallöchen!

After some 400 lens calibrations, I currently don't have the time
to do further calibrations promptly. New job. 58 uploads still have
to be processed; therefore, I am looking for help.

Here is the plan: I set up an ownCloud server and a GitHub repo.
People interested in helping get access to both. Every upload
creates an issue in the repo. The details of the workflow are
outlined here:
https://github.com/lensfun/lensfun/blob/master/tools/calibration-webserver/workflow.rst

I hope to get the calibration business on track again with that.
Besides, a single point of failure is never a good idea. I will
continue to calibrate myself; I just hope not to do it alone.

Over the next few days, I will write a HOWTO for calibrators.
Moreover, I will create a new screencast showing how to use the
latest Hugin for this. And of course, I will be responsive in the
issues.

If you are willing to help – much or little – send me an email.

Tschö,
Torsten.
--
Torsten Bronger Jabber ID: ***@jabber.rwth-aachen.de
Jonathan Niehof
2016-07-19 22:20:32 UTC
Happy to chip in, although it'll be a while before I've gone through my
own cameras.

I did take screenshots of the distortion workflow through the
current Hugin interface; they're at https://github.com/jtniehof/photo
(where I'll be putting more as I work through things worth sharing;
e.g. the SD1100 distortion definitely needs some work).
Matthias Andree
2016-07-19 23:12:04 UTC
Post by Torsten Bronger
Hallöchen!
After some 400 lens calibrations, I currently don't have the time
to do further calibrations promptly. New job. 58 uploads still have
to be processed; therefore, I am looking for help.
Here is the plan: I set up an ownCloud server and a GitHub repo.
People interested in helping get access to both. Every upload
creates an issue in the repo. The details of the workflow are
https://github.com/lensfun/lensfun/blob/master/tools/calibration-webserver/workflow.rst
I hope to get the calibration business on track again with that.
Besides, a single point of failure is never a good idea. I will
continue to calibrate myself; I just hope not to do it alone.
Over the next few days, I will write a HOWTO for calibrators.
Moreover, I will create a new screencast showing how to use the
latest Hugin for this. And of course, I will be responsive in the
issues.
If you are willing to help – much or little – send me an email.
Is there a list of lenses/cameras contained in those uploads?
Just to prevent someone taking more calibration images of a combo that
someone already uploaded...

I also looked through the older materials on calibration and have been
scratching my head over the illumination of vignetting test images; how
do you get good ones?
Torsten Bronger
2016-07-20 04:56:24 UTC
Hallöchen!
Post by Matthias Andree
[...]
Is there a list of lenses/cameras contained in those uploads?
Just to prevent someone taking more calibration images of a combo
that someone already uploaded...
That's a good point, but I don't have such a list. At
<http://wilson.bronger.org/lens-list.txt>, you can review the
content of the uploads directory. The file names have already been
rewritten to include EXIF data. But still, this list is of limited
use, and I don't like to advertise it. Note that 40% of all uploads
eventually turn out to be unusable. Mostly I get better ones from
the original uploader, but not always. Besides, in some cases, the
actual lens model is only determined during the calibration.

My goal is to have so few uploads pending that a collision is very
unlikely. It used to be like that.
Post by Matthias Andree
I also looked through the older materials on calibration and have
been scratching my head over the illumination of vignetting test
images, how do you get good ones.
It is less critical than it seems. A diffuser in front of the lens
is always necessary, but in contrast to my earlier texts, I now
think it can even be paper (if absolutely flat, of course). The
actual illumination should not cast a visible gradient on the
diffuser, but unless somebody points a laser pointer at it, I think
this will work in all practical cases.

Tschö,
Torsten.
--
Torsten Bronger Jabber ID: ***@jabber.rwth-aachen.de
j***@yepmail.net
2016-07-20 07:01:56 UTC
Post by Torsten Bronger
Post by Matthias Andree
I also looked through the older materials on calibration and have
been scratching my head over the illumination of vignetting test
images; how do you get good ones?
It is less critical than it seems. A diffuser in front of the lens
is always necessary, but in contrast to my earlier texts, I now
think it can even be paper (if absolutely flat, of course). The
actual illumination should not cast a visible gradient on the
diffuser, but unless somebody points a laser pointer at it, I think
this will work in all practical cases.
I found another interesting possibility for this recently, in the form of OLED mobile device screens. As mentioned in the existing tutorial, backlit displays don't make good sources of even lighting, but the OLED display on my current phone does a great job of this, as it turns out, when set to display an all-white screen[1]. Of course, it helps that my micro four-thirds lenses have fairly small front elements... and you would want to make sure your device screen is actually flat...

Anyway, I'll try to get to some of the calibration chores too, pending updated hugin instructions.

[1] handy android app for full-white screen: https://f-droid.org/repository/browse/?fdid=org.bc_bd.mrwhite
--
jys
Torsten Bronger
2016-07-20 07:44:56 UTC
Hallöchen!
Post by j***@yepmail.net
Post by Matthias Andree
I also looked through the older materials on calibration and
have been scratching my head over the illumination of vignetting
test images; how do you get good ones?
[...]
I found another interesting possibility for this recently, in the
form of OLED mobile device screens. [...]
I haven't had an OLED screen in my hand so far but evenness is only
one thing. The other thing is independence of direction. If you
tilt the screen, it must have the same brightness, with a tolerance
smaller than what could be evaluated with the naked eye.

Therefore, FWIW, I would not accept images taken with an OLED screen.
If you put a sheet of paper in between, it might be okay, though.
Post by j***@yepmail.net
Anyway, I'll try to get to some of the calibration chores too,
pending updated hugin instructions.
So I should send you access credentials?

Tschö,
Torsten.
--
Torsten Bronger Jabber ID: ***@jabber.rwth-aachen.de
j***@yepmail.net
2016-07-20 18:01:12 UTC
Post by Torsten Bronger
I haven't had an OLED screen in my hand so far but evenness is only
one thing. The other thing is independence of direction. If you
tilt the screen, it must have the same brightness, with a tolerance
smaller than what could be evaluated with the naked eye.
Therefore, FWIW, I would not accept images taken with an OLED screen.
If you put a sheet of paper in between, it might be okay, though.
Yes, a sheet of paper or other diffuser should still be used (I was using a piece of Roscolux diffuser gel), even if only because of the clear glass in front of the actual light source, which could otherwise let ambient light in around the edges... sorry, I didn't mean to imply otherwise.

Maybe the next time I'm profiling a lens for vignetting I'll try various experimental methods along with the less convenient known-good ones so the results can be compared, since this is something that people (including me) have questions about.
Post by Torsten Bronger
So I should send you access credentials?
Sure... I would feel more confident if there could also be some kind of peer-review among the people involved, even if only informally. I'll be busy for possibly the next week, and need to rebuild the deps for the script on a shiny new Slackware system, but should be able to contribute after that.

As a side note, while you're updating the profiling instructions, it might be worth expanding a little on the importance of profiling vignetting at different focus distances depending on lens type. I found, for instance, that applying the correction for infinity produced really bad overcorrection at close focus distances on the Olympus M.25mm F1.8, but I'm not sure how generally this is true of similar internally-focusing lenses... and I'm guessing that for external-focus types it's probably a non-issue, as implied in the current tutorial...
--
jys
Torsten Bronger
2016-07-21 17:26:11 UTC
Hallöchen!
Post by j***@yepmail.net
[...]
Post by Torsten Bronger
So I should send you access credentials?
Sure... I would feel more confident if there could also be some
kind of peer-review among the people involved, even if only
informally.
I check the incoming XML for correct syntax and plausibility.
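(Nothing elaborate. Conceptually it is something along the lines of the
following sketch, which is only an illustration and not my actual
procedure: parse the submitted file and see whether the distortion
coefficients are present and of a sane magnitude.)

import sys
import xml.etree.ElementTree as ElementTree

# Illustration only: reject XML that is not well-formed, and flag
# "ptlens" distortion coefficients of implausible magnitude.
try:
    root = ElementTree.parse(sys.argv[1]).getroot()
except ElementTree.ParseError as error:
    sys.exit("syntax error: {0}".format(error))

for distortion in root.iter("distortion"):
    if distortion.get("model") == "ptlens":
        a, b, c = (float(distortion.get(name, "0")) for name in "abc")
        if max(abs(a), abs(b), abs(c)) > 0.5:
            print("suspicious coefficients at focal={0}: {1}, {2}, {3}".format(
                distortion.get("focal"), a, b, c))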
Post by j***@yepmail.net
[...]
As a side note, while you're updating the profiling instructions,
it might be worth expanding a little on the importance of
profiling vignetting at different focus distances depending on
lens type. I found, for instance, that applying the correction for
infinity produced really bad overcorrection at close focus
distances on the Olympus M.25mm F1.8, but I'm not sure how
generally this is true of similar internally-focusing
lenses... and I'm guessing that for external-focus types it's
probably a non-issue, as implied in the current tutorial...
People who see a problem should consider sending sample images. I
don't want to give the impression that it is important, because it
would raise the threshold for vignetting images even higher. It is
difficult to get people to do this.

Tschö,
Torsten.
--
Torsten Bronger Jabber ID: ***@jabber.rwth-aachen.de
j***@yepmail.net
2016-07-26 16:35:51 UTC
Post by Torsten Bronger
Post by j***@yepmail.net
As a side note, while you're updating the profiling instructions,
it might be worth expanding a little on the importance of
profiling vignetting at different focus distances depending on
lens type. I found, for instance, that applying the correction for
infinity produced really bad overcorrection at close focus
distances on the Olympus M.25mm F1.8, but I'm not sure how
generally this is true of similar internally-focusing
lenses... and I'm guessing that for external-focus types it's
probably a non-issue, as implied in the current tutorial...
People who see a problem should consider sending sample images. I
don't want to give the impression that it is important, because it
would raise the threshold for vignetting images even higher. It is
difficult to get people to do this.
That's understandable, although I somewhat suspect that the biggest hurdle for people is getting set up to take the vignetting images in the first place...

Anyway, after my first attempt at attacking the backlog, I have a few comments/questions:

Regarding the calibrate.py script, I had an "interesting" time getting the dependencies in line... it seems these have been active times in the world of linear algebra software lately, involving much breakage of backwards compatibility... to make a long story short, I had to fall back to older versions of numpy and lapack (1.8.2 and 3.3.1, respectively) to build a scipy that played nicely with the script... in case that's useful information for anybody else trying to make it work. Also, the link to information about names for makers and mounts in the script is a 404, and since it's a shortened link, I'm not sure what it pointed to, exactly.

Rather than do a full checkout of the ownCloud repository, I just downloaded the directory I wanted, then uploaded the resulting lenses.txt and lensfun.xml files after calibration and moved the files to a newly created "distortion" directory, all using the web interface. Since the same files were used for tca, I just created that directory and left a note in it to that effect, rather than duplicate the files. The instructions seem to indicate that uploading the other resulting files (.pto, etc.) isn't desired, so I didn't. Is this OK, or should I do something differently?

Also, I found this tutorial helpful, and didn't notice it linked in any of the instructions:

http://hugin.sourceforge.net/tutorials/calibration/en.shtml

Since recent Hugin versions don't display the correction parameters in a copy-friendly manner, I just exported an .ini lens data file for each of the focal lengths, then used the attached quick'n'dirty script to scrape the data into the lenses.txt file, in case anybody else finds that method useful.
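In case the attachment doesn't make it through the list software, the gist of it looks roughly like the sketch below. It is not the exact script; it assumes the exported .ini contains keys like Lens\a\value, Lens\b\value and Lens\c\value for the distortion coefficients, that the files are named after the focal length (e.g. 18mm.ini), and that lenses.txt takes lines of the form "distortion(<focal>mm) = a, b, c". Check all of that against your own files before trusting it.

import glob
import re

# Matches keys such as  Lens\a\value=0.0123  in a Hugin lens .ini export.
coefficient_re = re.compile(r"Lens\\([abc])\\value\s*=\s*([-0-9.eE]+)")

with open("lenses.txt", "a") as lenses_txt:
    for ini_path in sorted(glob.glob("*.ini")):
        focal = ini_path[:-len(".ini")]          # e.g. "18mm"
        coefficients = {}
        with open(ini_path) as ini_file:
            for line in ini_file:
                match = coefficient_re.search(line)
                if match:
                    coefficients[match.group(1)] = float(match.group(2))
        if set(coefficients) == {"a", "b", "c"}:
            lenses_txt.write("distortion({0}) = {1}, {2}, {3}\n".format(
                focal, coefficients["a"], coefficients["b"], coefficients["c"]))
        else:
            print("skipping {0}: could not find all of a, b, c".format(ini_path))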

I'll try to get to some more lenses once I'm sure I'm not Doing It Wrong.

Cheers!
jys
Torsten Bronger
2016-08-01 06:50:15 UTC
Hallöchen!
Post by j***@yepmail.net
[...]
Regarding the calibrate.py script, I had an "interesting" time
getting the dependencies in line... it seems these have been active
times in the world of linear algebra software lately, involving
much breakage of backwards compatibility... to make a long story
short, I had to fall back to older versions of numpy and lapack
(1.8.2 and 3.3.1, respectively) to build a scipy that played
nicely with the script... in case that's useful information for
anybody else trying to make it work.
"leastsq" is a very old and basic function from scipy, and the only
one calibrate.py calls. Any idea about the cause of this trouble?
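For context, the call is nothing more exotic than something of the
following shape. This is a stripped-down illustration with made-up
numbers, not the actual code of calibrate.py; the model is lensfun's
"pa" vignetting polynomial 1 + k1*r^2 + k2*r^4 + k3*r^6.

import numpy
from scipy.optimize import leastsq

# Made-up sample data: normalised radius and relative brightness.
radii = numpy.linspace(0, 1, 20)
brightness = 1 - 0.3 * radii**2 - 0.1 * radii**4

def residuals(parameters, radii, brightness):
    k1, k2, k3 = parameters
    model = 1 + k1 * radii**2 + k2 * radii**4 + k3 * radii**6
    return brightness - model

(k1, k2, k3), _ = leastsq(residuals, (0.0, 0.0, 0.0), args=(radii, brightness))
print("k1 = {0:.4f}, k2 = {1:.4f}, k3 = {2:.4f}".format(k1, k2, k3))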
Post by j***@yepmail.net
Also, the link to information about names for makers and mounts in
the script is a 404, and since it's a shortened link, I'm not
sure what it pointed to, exactly.
Yes, indeed. It is fixed.
Post by j***@yepmail.net
Rather than do a full checkout of the ownCloud repository, I just
downloaded the directory I wanted, then uploaded the resulting
lenses.txt and lensfun.xml files after calibration, as well as
moving the files to a newly created "distortion" directory, all
using the web interface. Since the same files were used for tca, I
just created that directory and left a note in it to that effect,
rather than duplicate the files. The instructions seem to indicate
that uploading the other resulting files (.pto, etc) isn't
desired, so I didn't. Is this ok, or should I do something
differently?
If it works for you, it sounds fine – I am not an expert with
ownCloud.
Post by j***@yepmail.net
Also, found this tutorial helpful, and didn't notice them linked
http://hugin.sourceforge.net/tutorials/calibration/en.shtml
Yes, this is the origin of the method I recommend.
Post by j***@yepmail.net
Since recent Hugin versions don't display the correction
parameters in a copy-friendly manner, I just exported an .ini lens
data file for each of the focal lengths, then used the attached
quick'n'dirty script to scrape the data into the lenses.txt file,
in case anybody else finds that method useful.
I right-click on the correction parameters and choose "edit image
parameters". Then, I can copy-and-paste. Not perfect but works for
me.

Tschö,
Torsten.
--
Torsten Bronger Jabber ID: ***@jabber.rwth-aachen.de


j***@yepmail.net
2016-08-01 21:42:04 UTC
Post by Torsten Bronger
"leastsq" is a very old and basic function from scipy, and the only
one calibrate.py calls. Any idea about the cause of this trouble?
I didn't go any further down that rabbit hole than I needed to in order to get things working, but from what I gathered, things changed in numpy 1.9.0 and in lapack 3.5.0, and supposedly a recent enough version of scipy should be patched up against all of it, but in my case it was still segfaulting because of missing symbols. Falling back to older versions of the deps got things working, finally, at which point I was happy to stop thinking about any of it.

Possibly relevant discussion:
https://github.com/scipy/scipy/issues/5266
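If anyone wants a quick way to see whether their own scipy build is
affected, a minimal leastsq call like the one below (just a generic
sanity check, nothing taken from calibrate.py) should either print a
sensible result or fail right away:

import numpy
from scipy.optimize import leastsq

# Fit a straight line to three points; expect roughly [2., 1.].
x = numpy.array([0.0, 1.0, 2.0])
y = numpy.array([1.0, 3.0, 5.0])

def residuals(parameters):
    slope, intercept = parameters
    return y - (slope * x + intercept)

parameters, status = leastsq(residuals, (1.0, 0.0))
print(parameters)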
Post by Torsten Bronger
If it works for you, it sounds fine – I am not an expert with
ownCloud.
The web interface seems to have the basic functionality for the task... enough so that I prefer it to maintaining such a large local share, personally. Others may also find this a nice option to consider, I guess.
Post by Torsten Bronger
I right-click on the correction parameters and choose "edit image
parameters". Then, I can copy-and-paste. Not perfect but works for
me.
I did find the parameter editing screen via another route, but not being much of a mouse user, I'm trying to save it for control point selection. ;)
--
jys

Dave
2016-07-19 04:45:57 UTC
Hello.
I would like to help calibrate.
The HOWTO will be necessary though; I have not done this before.

Regards

Dave
Hallöchen!
After some 400 lens calibrations, I currently don't have the time
to do further calibrations promptly. New job. 58 uploads still have
to be processed; therefore, I am looking for help.
Here is the plan: I set up an ownCloud server and a GitHub repo.
People interested in helping get access to both. Every upload
creates an issue in the repo. The details of the workflow are
https://github.com/lensfun/lensfun/blob/master/tools/calibration-webserver/workflow.rst
I hope to get the calibration business on track again with that.
Besides, a single point of failure is never a good idea. I will
continue to calibrate myself; I just hope not to do it alone.
Over the next few days, I will write a HOWTO for calibrators.
Moreover, I will create a new screencast showing how to use the
latest Hugin for this. And of course, I will be responsive in the
issues.
If you are willing to help – much or little – send me an email.
Tschö,
Torsten.