Commit 2ab7fb60 authored by Vadim Pisarevsky

Merge pull request #3017 from f-morozov:akaze

.. _akazeMatching:

AKAZE local features matching
******************************

Introduction
------------------

In this tutorial we will learn how to use [AKAZE]_ local features to detect and match keypoints on two images.

We will find keypoints on a pair of images related by a given homography matrix, match them, and count the number of inliers (i.e. matches that fit the given homography).

You can find an expanded version of this example here: https://github.com/pablofdezalc/test_kaze_akaze_opencv

.. [AKAZE] Fast Explicit Diffusion for Accelerated Features in Nonlinear Scale Spaces. Pablo F. Alcantarilla, Jesús Nuevo and Adrien Bartoli. In British Machine Vision Conference (BMVC), Bristol, UK, September 2013.
Data
------------------

We are going to use images 1 and 3 from the *Graffiti* sequence of the Oxford dataset.

.. image:: images/graf.png
  :height: 200pt
  :width: 320pt
  :alt: Graffiti
  :align: center

The homography is given by a 3 by 3 matrix:

.. code-block:: none

  7.6285898e-01  -2.9922929e-01   2.2567123e+02
  3.3443473e-01   1.0143901e+00  -7.6999973e+01
  3.4663091e-04  -1.4364524e-05   1.0000000e+00

You can find the images (*graf1.png*, *graf3.png*) and the homography (*H1to3p.xml*) in *opencv/samples/cpp*.
Source Code
===========

.. literalinclude:: ../../../../samples/cpp/tutorial_code/features2D/AKAZE_match.cpp
   :language: cpp
   :linenos:
   :tab-width: 4

Explanation
===========
1. **Load images and homography**

   .. code-block:: cpp

      Mat img1 = imread("graf1.png", IMREAD_GRAYSCALE);
      Mat img2 = imread("graf3.png", IMREAD_GRAYSCALE);

      Mat homography;
      FileStorage fs("H1to3p.xml", FileStorage::READ);
      fs.getFirstTopLevelNode() >> homography;

   We are loading grayscale images here. The homography is stored in an XML file created with FileStorage.
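   For reference, a homography file in this format can be produced with the same *FileStorage* API. The sketch below is illustrative only; it writes the matrix under the node name *H13*, which is the node name used in the bundled *H1to3p.xml*:

   .. code-block:: cpp

      // Sketch: writing the ground-truth homography to XML with FileStorage.
      Mat H = (Mat_<double>(3, 3) <<
          7.6285898e-01, -2.9922929e-01,  2.2567123e+02,
          3.3443473e-01,  1.0143901e+00, -7.6999973e+01,
          3.4663091e-04, -1.4364524e-05,  1.0000000e+00);

      FileStorage out("H1to3p.xml", FileStorage::WRITE);
      out << "H13" << H;   // read it back with fs.getFirstTopLevelNode() >> homography;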
2. **Detect keypoints and compute descriptors using AKAZE**

   .. code-block:: cpp

      vector<KeyPoint> kpts1, kpts2;
      Mat desc1, desc2;

      AKAZE akaze;
      akaze(img1, noArray(), kpts1, desc1);
      akaze(img2, noArray(), kpts2, desc2);

   We create an AKAZE object and use its *operator()* functionality. Since we don't need the *mask* parameter, *noArray()* is used.
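   The default constructor selects the binary MLDB descriptor. The extended constructor documented in this change also exposes the detector parameters; a tuning sketch (the values shown are simply the documented defaults):

   .. code-block:: cpp

      // Sketch: AKAZE with explicit parameters (documented defaults shown).
      AKAZE akaze_tuned(DESCRIPTOR_MLDB,  // descriptor type (binary M-LDB)
                        0,                // descriptor size in bits, 0 = full size
                        3,                // descriptor channels
                        0.001f,           // detector response threshold
                        4,                // octaves
                        4,                // sublevels per octave
                        DIFF_PM_G2);      // diffusivity type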
3. **Use brute-force matcher to find 2-nn matches**

   .. code-block:: cpp

      BFMatcher matcher(NORM_HAMMING);
      vector< vector<DMatch> > nn_matches;
      matcher.knnMatch(desc1, desc2, nn_matches, 2);

   We use Hamming distance, because AKAZE uses a binary descriptor by default.
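   The norm should follow the descriptor type: the floating-point KAZE-style descriptors (*DESCRIPTOR_KAZE*, *DESCRIPTOR_KAZE_UPRIGHT*) are matched with L2 distance, while the binary MLDB descriptors use Hamming distance, mirroring the default-norm logic added in this change. A sketch of choosing the matcher accordingly (``descriptor_type`` stands for whichever type was passed to the AKAZE constructor):

   .. code-block:: cpp

      // Sketch: pick the matching norm from the descriptor type.
      int norm = (descriptor_type == DESCRIPTOR_KAZE ||
                  descriptor_type == DESCRIPTOR_KAZE_UPRIGHT) ? NORM_L2 : NORM_HAMMING;
      BFMatcher matcher(norm);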
4. **Use 2-nn matches to find correct keypoint matches**

   .. code-block:: cpp

      for(size_t i = 0; i < nn_matches.size(); i++) {
          DMatch first = nn_matches[i][0];
          float dist1 = nn_matches[i][0].distance;
          float dist2 = nn_matches[i][1].distance;

          if(dist1 < nn_match_ratio * dist2) {
              matched1.push_back(kpts1[first.queryIdx]);
              matched2.push_back(kpts2[first.trainIdx]);
          }
      }

   If the distance to the closest match is less than *nn_match_ratio* times the distance to the second closest one, the match is considered correct.
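   For example, assuming *nn_match_ratio* is set to 0.8, a match with distances 30 (closest) and 50 (second closest) is kept because 30 < 0.8 * 50 = 40, whereas a match with distances 45 and 50 would be rejected.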
5. **Check if our matches fit in the homography model**

   .. code-block:: cpp

      for(int i = 0; i < matched1.size(); i++) {
          Mat col = Mat::ones(3, 1, CV_64F);
          col.at<double>(0) = matched1[i].pt.x;
          col.at<double>(1) = matched1[i].pt.y;

          col = homography * col;
          col /= col.at<double>(2);
          float dist = sqrt( pow(col.at<double>(0) - matched2[i].pt.x, 2) +
                             pow(col.at<double>(1) - matched2[i].pt.y, 2));

          if(dist < inlier_threshold) {
              int new_i = inliers1.size();
              inliers1.push_back(matched1[i]);
              inliers2.push_back(matched2[i]);
              good_matches.push_back(DMatch(new_i, new_i, 0));
          }
      }

   If the distance from the first keypoint's projection to the second keypoint is less than the threshold, then it fits the homography model.

   We create a new set of matches for the inliers, because it is required by the drawing function.
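   Written out, the inlier test projects the keypoint from the first image with the homography and thresholds the reprojection error:

   .. math::

      \begin{pmatrix} x' \\ y' \\ w' \end{pmatrix} = H \begin{pmatrix} x_1 \\ y_1 \\ 1 \end{pmatrix},
      \qquad
      \sqrt{ \left( \frac{x'}{w'} - x_2 \right)^2 + \left( \frac{y'}{w'} - y_2 \right)^2 } < \tau,

   where :math:`(x_1, y_1)` and :math:`(x_2, y_2)` are the coordinates of the matched keypoints and :math:`\tau` is *inlier_threshold*.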
6. **Output results**

   .. code-block:: cpp

      Mat res;
      drawMatches(img1, inliers1, img2, inliers2, good_matches, res);
      imwrite("res.png", res);
      ...

   Here we save the resulting image and print some statistics.
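   A sketch of how the statistics shown below can be printed (the exact formatting in the sample may differ):

   .. code-block:: cpp

      // Sketch: report keypoint, match and inlier counts plus the inlier ratio.
      double inlier_ratio = inliers1.size() * 1.0 / matched1.size();
      cout << "A-KAZE Matching Results"              << endl;
      cout << "Keypoints 1:   \t" << kpts1.size()    << endl;
      cout << "Keypoints 2:   \t" << kpts2.size()    << endl;
      cout << "Matches:       \t" << matched1.size() << endl;
      cout << "Inliers:       \t" << inliers1.size() << endl;
      cout << "Inliers Ratio: \t" << inlier_ratio    << endl;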
Results
=======

Found matches
--------------

.. image:: images/res.png
  :height: 200pt
  :width: 320pt
  :alt: Matches
  :align: center

A-KAZE Matching Results
--------------------------

.. code-block:: none

   Keypoints 1:   2943
   Keypoints 2:   3511
   Matches:       447
   Inliers:       308
   Inliers Ratio: 0.689038
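The inlier ratio is the number of inliers divided by the number of matches that survived the ratio test: 308 / 447 ≈ 0.689.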
@@ -183,6 +183,25 @@ Learn about how to use the feature points detectors, descriptors and matching f
:height: 90pt
:width: 90pt
+
.. tabularcolumns:: m{100pt} m{300pt}
.. cssclass:: toctableopencv

===================== ==============================================
|AkazeMatch|          **Title:** :ref:`akazeMatching`

                      *Compatibility:* > OpenCV 3.0

                      *Author:* Fedor Morozov

                      Use *AKAZE* local features to find correspondence between two images.

===================== ==============================================

.. |AkazeMatch| image:: images/AKAZE_Match_Tutorial_Cover.png
   :height: 90pt
   :width:  90pt

.. raw:: latex

   \pagebreak
@@ -201,3 +220,4 @@ Learn about how to use the feature points detectors, descriptors and matching f
../feature_flann_matcher/feature_flann_matcher
../feature_homography/feature_homography
../detection_of_planar_objects/detection_of_planar_objects
../akaze_matching/akaze_matching
@@ -31,7 +31,7 @@ Detects corners using the FAST algorithm
Detects corners using the FAST algorithm by [Rosten06]_.
..note:: In Python API, types are given as ``cv2.FAST_FEATURE_DETECTOR_TYPE_5_8``, ``cv2.FAST_FEATURE_DETECTOR_TYPE_7_12`` and ``cv2.FAST_FEATURE_DETECTOR_TYPE_9_16``. For corner detection, use ``cv2.FAST.detect()`` method.
.. note:: In Python API, types are given as ``cv2.FAST_FEATURE_DETECTOR_TYPE_5_8``, ``cv2.FAST_FEATURE_DETECTOR_TYPE_7_12`` and ``cv2.FAST_FEATURE_DETECTOR_TYPE_9_16``. For corner detection, use ``cv2.FAST.detect()`` method.
.. [Rosten06] E. Rosten. Machine Learning for High-speed Corner Detection, 2006.
@@ -254,7 +254,17 @@ KAZE
----
.. ocv:class:: KAZE : public Feature2D
Class implementing the KAZE keypoint detector and descriptor extractor, described in [ABD12]_.
Class implementing the KAZE keypoint detector and descriptor extractor, described in [ABD12]_. ::
    class CV_EXPORTS_W KAZE : public Feature2D
    {
    public:
        CV_WRAP KAZE();
        CV_WRAP explicit KAZE(bool extended, bool upright, float threshold = 0.001f,
                              int octaves = 4, int sublevels = 4, int diffusivity = DIFF_PM_G2);
    };
.. note:: AKAZE descriptor can only be used with KAZE or AKAZE keypoints
.. [ABD12] KAZE Features. Pablo F. Alcantarilla, Adrien Bartoli and Andrew J. Davison. In European Conference on Computer Vision (ECCV), Firenze, Italy, October 2012.
@@ -262,12 +272,14 @@ KAZE::KAZE
----------
The KAZE constructor
.. ocv:function:: KAZE::KAZE(bool extended, bool upright)
.. ocv:function:: KAZE::KAZE(bool extended, bool upright, float threshold, int octaves, int sublevels, int diffusivity)
:param extended: Set to enable extraction of extended (128-byte) descriptor.
:param upright: Set to enable use of upright descriptors (non rotation-invariant).
:param threshold: Detector response threshold to accept point
:param octaves: Maximum octave evolution of the image
:param sublevels: Default number of sublevels per scale level
:param diffusivity: Diffusivity type. DIFF_PM_G1, DIFF_PM_G2, DIFF_WEICKERT or DIFF_CHARBONNIER
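A brief usage sketch for this constructor (illustrative only; ``img`` stands for a grayscale ``CV_8U`` input image): ::

    cv::KAZE kaze(true, false);             // extended 128-float descriptor, rotation invariant
    std::vector<cv::KeyPoint> kpts;
    cv::Mat desc;
    kaze(img, cv::noArray(), kpts, desc);   // detect keypoints and compute descriptors
    // KAZE descriptors are floating point; match them with NORM_L2.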
AKAZE
-----
@@ -278,25 +290,25 @@ Class implementing the AKAZE keypoint detector and descriptor extractor, describ
    class CV_EXPORTS_W AKAZE : public Feature2D
    {
    public:
        /// AKAZE Descriptor Type
        enum DESCRIPTOR_TYPE {
            DESCRIPTOR_KAZE_UPRIGHT = 2, ///< Upright descriptors, not invariant to rotation
            DESCRIPTOR_KAZE = 3,
            DESCRIPTOR_MLDB_UPRIGHT = 4, ///< Upright descriptors, not invariant to rotation
            DESCRIPTOR_MLDB = 5
        };

        CV_WRAP AKAZE();
        explicit AKAZE(DESCRIPTOR_TYPE descriptor_type, int descriptor_size = 0, int descriptor_channels = 3);
        CV_WRAP explicit AKAZE(int descriptor_type, int descriptor_size = 0, int descriptor_channels = 3,
                               float threshold = 0.001f, int octaves = 4, int sublevels = 4, int diffusivity = DIFF_PM_G2);
    };
.. note:: AKAZE descriptor can only be used with KAZE or AKAZE keypoints
.. [ANB13] Fast Explicit Diffusion for Accelerated Features in Nonlinear Scale Spaces. Pablo F. Alcantarilla, Jesús Nuevo and Adrien Bartoli. In British Machine Vision Conference (BMVC), Bristol, UK, September 2013.
AKAZE::AKAZE
------------
The AKAZE constructor
.. ocv:function:: AKAZE::AKAZE(DESCRIPTOR_TYPE descriptor_type, int descriptor_size = 0, int descriptor_channels = 3)
.. ocv:function:: AKAZE::AKAZE(int descriptor_type, int descriptor_size, int descriptor_channels, float threshold, int octaves, int sublevels, int diffusivity)
:param descriptor_type: Type of the extracted descriptor.
:param descriptor_type: Type of the extracted descriptor: DESCRIPTOR_KAZE, DESCRIPTOR_KAZE_UPRIGHT, DESCRIPTOR_MLDB or DESCRIPTOR_MLDB_UPRIGHT.
:param descriptor_size: Size of the descriptor in bits. 0 -> Full size
:param descriptor_channels: Number of channels in the descriptor (1, 2, 3).
:param descriptor_channels: Number of channels in the descriptor (1, 2, 3)
:param threshold: Detector response threshold to accept point
:param octaves: Maximum octave evolution of the image
:param sublevels: Default number of sublevels per scale level
:param diffusivity: Diffusivity type. DIFF_PM_G1, DIFF_PM_G2, DIFF_WEICKERT or DIFF_CHARBONNIER
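The detector and extractor are also registered with the algorithm factory, so they can be created by name, as done in the regression tests accompanying this change (a sketch; ``img`` is an assumed grayscale input, and AKAZE descriptors can only be computed on KAZE or AKAZE keypoints): ::

    cv::Ptr<cv::FeatureDetector> detector = cv::FeatureDetector::create("AKAZE");
    cv::Ptr<cv::DescriptorExtractor> extractor = cv::DescriptorExtractor::create("AKAZE");

    std::vector<cv::KeyPoint> kpts;
    cv::Mat desc;
    detector->detect(img, kpts);
    extractor->compute(img, kpts, desc);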
@@ -895,6 +895,22 @@ protected:
PixelTestFn test_fn_;
};
// KAZE/AKAZE diffusivity
enum {
    DIFF_PM_G1 = 0,
    DIFF_PM_G2 = 1,
    DIFF_WEICKERT = 2,
    DIFF_CHARBONNIER = 3
};

// AKAZE descriptor type
enum {
    DESCRIPTOR_KAZE_UPRIGHT = 2, ///< Upright descriptors, not invariant to rotation
    DESCRIPTOR_KAZE = 3,
    DESCRIPTOR_MLDB_UPRIGHT = 4, ///< Upright descriptors, not invariant to rotation
    DESCRIPTOR_MLDB = 5
};
/*!
KAZE implementation
*/
@@ -902,7 +918,8 @@ class CV_EXPORTS_W KAZE : public Feature2D
{
public:
CV_WRAP KAZE();
CV_WRAP explicit KAZE(bool extended, bool upright);
CV_WRAP explicit KAZE(bool extended, bool upright, float threshold = 0.001f,
int octaves = 4, int sublevels = 4, int diffusivity = DIFF_PM_G2);
virtual ~KAZE();
@@ -928,6 +945,10 @@ protected:
CV_PROP bool extended;
CV_PROP bool upright;
CV_PROP float threshold;
CV_PROP int octaves;
CV_PROP int sublevels;
CV_PROP int diffusivity;
};
/*!
@@ -936,16 +957,9 @@ AKAZE implementation
class CV_EXPORTS_W AKAZE : public Feature2D
{
public:
/// AKAZE Descriptor Type
enum DESCRIPTOR_TYPE {
DESCRIPTOR_KAZE_UPRIGHT = 2, ///< Upright descriptors, not invariant to rotation
DESCRIPTOR_KAZE = 3,
DESCRIPTOR_MLDB_UPRIGHT = 4, ///< Upright descriptors, not invariant to rotation
DESCRIPTOR_MLDB = 5
};
CV_WRAP AKAZE();
explicit AKAZE(DESCRIPTOR_TYPE descriptor_type, int descriptor_size = 0, int descriptor_channels = 3);
CV_WRAP explicit AKAZE(int descriptor_type, int descriptor_size = 0, int descriptor_channels = 3,
float threshold = 0.001f, int octaves = 4, int sublevels = 4, int diffusivity = DIFF_PM_G2);
virtual ~AKAZE();
@@ -973,7 +987,10 @@ protected:
CV_PROP int descriptor;
CV_PROP int descriptor_channels;
CV_PROP int descriptor_size;
CV_PROP float threshold;
CV_PROP int octaves;
CV_PROP int sublevels;
CV_PROP int diffusivity;
};
/****************************************************************************************\
* Distance *
@@ -49,7 +49,10 @@ http://www.robesafe.com/personal/pablo.alcantarilla/papers/Alcantarilla13bmvc.pd
*/
#include "precomp.hpp"
#include "akaze/AKAZEFeatures.h"
#include "kaze/AKAZEFeatures.h"
#include <iostream>
using namespace std;
namespace cv
{
@@ -57,13 +60,22 @@ namespace cv
: descriptor(DESCRIPTOR_MLDB)
, descriptor_channels(3)
, descriptor_size(0)
, threshold(0.001f)
, octaves(4)
, sublevels(4)
, diffusivity(DIFF_PM_G2)
{
}
AKAZE::AKAZE(DESCRIPTOR_TYPE _descriptor_type, int _descriptor_size, int _descriptor_channels)
AKAZE::AKAZE(int _descriptor_type, int _descriptor_size, int _descriptor_channels,
float _threshold, int _octaves, int _sublevels, int _diffusivity)
: descriptor(_descriptor_type)
, descriptor_channels(_descriptor_channels)
, descriptor_size(_descriptor_size)
, threshold(_threshold)
, octaves(_octaves)
, sublevels(_sublevels)
, diffusivity(_diffusivity)
{
}
@@ -78,12 +90,12 @@ namespace cv
{
switch (descriptor)
{
case cv::AKAZE::DESCRIPTOR_KAZE:
case cv::AKAZE::DESCRIPTOR_KAZE_UPRIGHT:
case cv::DESCRIPTOR_KAZE:
case cv::DESCRIPTOR_KAZE_UPRIGHT:
return 64;
case cv::AKAZE::DESCRIPTOR_MLDB:
case cv::AKAZE::DESCRIPTOR_MLDB_UPRIGHT:
case cv::DESCRIPTOR_MLDB:
case cv::DESCRIPTOR_MLDB_UPRIGHT:
// We use the full length binary descriptor -> 486 bits
if (descriptor_size == 0)
{
@@ -106,12 +118,12 @@
{
switch (descriptor)
{
case cv::AKAZE::DESCRIPTOR_KAZE:
case cv::AKAZE::DESCRIPTOR_KAZE_UPRIGHT:
case cv::DESCRIPTOR_KAZE:
case cv::DESCRIPTOR_KAZE_UPRIGHT:
return CV_32F;
case cv::AKAZE::DESCRIPTOR_MLDB:
case cv::AKAZE::DESCRIPTOR_MLDB_UPRIGHT:
case cv::DESCRIPTOR_MLDB:
case cv::DESCRIPTOR_MLDB_UPRIGHT:
return CV_8U;
default:
@@ -124,12 +136,12 @@
{
switch (descriptor)
{
case cv::AKAZE::DESCRIPTOR_KAZE:
case cv::AKAZE::DESCRIPTOR_KAZE_UPRIGHT:
case cv::DESCRIPTOR_KAZE:
case cv::DESCRIPTOR_KAZE_UPRIGHT:
return cv::NORM_L2;
case cv::AKAZE::DESCRIPTOR_MLDB:
case cv::AKAZE::DESCRIPTOR_MLDB_UPRIGHT:
case cv::DESCRIPTOR_MLDB:
case cv::DESCRIPTOR_MLDB_UPRIGHT:
return cv::NORM_HAMMING;
default:
@@ -153,11 +165,15 @@
cv::Mat& desc = descriptors.getMatRef();
AKAZEOptions options;
options.descriptor = static_cast<DESCRIPTOR_TYPE>(descriptor);
options.descriptor = descriptor;
options.descriptor_channels = descriptor_channels;
options.descriptor_size = descriptor_size;
options.img_width = img.cols;
options.img_height = img.rows;
options.dthreshold = threshold;
options.omax = octaves;
options.nsublevels = sublevels;
options.diffusivity = diffusivity;
AKAZEFeatures impl(options);
impl.Create_Nonlinear_Scale_Space(img1_32);
@@ -188,7 +204,7 @@ namespace cv
img.convertTo(img1_32, CV_32F, 1.0 / 255.0, 0);
AKAZEOptions options;
options.descriptor = static_cast<DESCRIPTOR_TYPE>(descriptor);
options.descriptor = descriptor;
options.descriptor_channels = descriptor_channels;
options.descriptor_size = descriptor_size;
options.img_width = img.cols;
@@ -216,7 +232,7 @@
cv::Mat& desc = descriptors.getMatRef();
AKAZEOptions options;
options.descriptor = static_cast<DESCRIPTOR_TYPE>(descriptor);
options.descriptor = descriptor;
options.descriptor_channels = descriptor_channels;
options.descriptor_size = descriptor_size;
options.img_width = img.cols;
@@ -229,4 +245,4 @@ namespace cv
CV_Assert((!desc.rows || desc.cols == descriptorSize()));
CV_Assert((!desc.rows || (desc.type() == descriptorType())));
}
}
\ No newline at end of file
}
/**
* @file AKAZE.h
* @brief Main class for detecting and computing binary descriptors in an
* accelerated nonlinear scale space
* @date Mar 27, 2013
* @author Pablo F. Alcantarilla, Jesus Nuevo
*/
#pragma once
/* ************************************************************************* */
// Includes
#include "precomp.hpp"
#include "AKAZEConfig.h"
/* ************************************************************************* */
// AKAZE Class Declaration
class AKAZEFeatures {
private:
AKAZEOptions options_; ///< Configuration options for AKAZE
std::vector<TEvolution> evolution_; ///< Vector of nonlinear diffusion evolution
/// FED parameters
int ncycles_; ///< Number of cycles
bool reordering_; ///< Flag for reordering time steps
std::vector<std::vector<float > > tsteps_; ///< Vector of FED dynamic time steps
std::vector<int> nsteps_; ///< Vector of number of steps per cycle
/// Matrices for the M-LDB descriptor computation
cv::Mat descriptorSamples_; // List of positions in the grids to sample LDB bits from.
cv::Mat descriptorBits_;
cv::Mat bitMask_;
public:
/// Constructor with input arguments
AKAZEFeatures(const AKAZEOptions& options);
/// Scale Space methods
void Allocate_Memory_Evolution();
int Create_Nonlinear_Scale_Space(const cv::Mat& img);
void Feature_Detection(std::vector<cv::KeyPoint>& kpts);
void Compute_Determinant_Hessian_Response(void);
void Compute_Multiscale_Derivatives(void);
void Find_Scale_Space_Extrema(std::vector<cv::KeyPoint>& kpts);
void Do_Subpixel_Refinement(std::vector<cv::KeyPoint>& kpts);
// Feature description methods
void Compute_Descriptors(std::vector<cv::KeyPoint>& kpts, cv::Mat& desc);
static void Compute_Main_Orientation(cv::KeyPoint& kpt, const std::vector<TEvolution>& evolution_);
};
/* ************************************************************************* */
// Inline functions
// Inline functions
void generateDescriptorSubsample(cv::Mat& sampleList, cv::Mat& comparisons,
int nbits, int pattern_size, int nchannels);
float get_angle(float x, float y);
float gaussian(float x, float y, float sigma);
void check_descriptor_limits(int& x, int& y, int width, int height);
int fRound(float flt);
@@ -55,12 +55,21 @@ namespace cv
KAZE::KAZE()
: extended(false)
, upright(false)
, threshold(0.001f)
, octaves(4)
, sublevels(4)
, diffusivity(DIFF_PM_G2)
{
}
KAZE::KAZE(bool _extended, bool _upright)
KAZE::KAZE(bool _extended, bool _upright, float _threshold, int _octaves,
int _sublevels, int _diffusivity)
: extended(_extended)
, upright(_upright)
, threshold(_threshold)
, octaves(_octaves)
, sublevels(_sublevels)
, diffusivity(_diffusivity)
{
}
@@ -111,6 +120,10 @@ namespace cv
options.img_height = img.rows;
options.extended = extended;
options.upright = upright;
options.dthreshold = threshold;
options.omax = octaves;
options.nsublevels = sublevels;
options.diffusivity = diffusivity;
KAZEFeatures impl(options);
impl.Create_Nonlinear_Scale_Space(img1_32);
@@ -180,4 +193,4 @@ namespace cv
CV_Assert((!desc.rows || desc.cols == descriptorSize()));
CV_Assert((!desc.rows || (desc.type() == descriptorType())));
}
}
\ No newline at end of file
}
@@ -5,7 +5,8 @@
* @author Pablo F. Alcantarilla, Jesus Nuevo
*/
#pragma once
#ifndef __OPENCV_FEATURES_2D_AKAZE_CONFIG_H__
#define __OPENCV_FEATURES_2D_AKAZE_CONFIG_H__
/* ************************************************************************* */
// OpenCV
@@ -28,14 +29,6 @@ const float gauss25[7][7] = {
/// AKAZE configuration options structure
struct AKAZEOptions {
/// AKAZE Diffusivities
enum DIFFUSIVITY_TYPE {
PM_G1 = 0,
PM_G2 = 1,
WEICKERT = 2,
CHARBONNIER = 3
};
AKAZEOptions()
: omax(4)
, nsublevels(4)
@@ -44,12 +37,12 @@
, soffset(1.6f)
, derivative_factor(1.5f)
, sderivatives(1.0)
, diffusivity(PM_G2)
, diffusivity(cv::DIFF_PM_G2)
, dthreshold(0.001f)
, min_dthreshold(0.00001f)
, descriptor(cv::AKAZE::DESCRIPTOR_MLDB)
, descriptor(cv::DESCRIPTOR_MLDB)
, descriptor_size(0)
, descriptor_channels(3)
, descriptor_pattern_size(10)
@@ -67,12 +60,12 @@
float soffset; ///< Base scale offset (sigma units)
float derivative_factor; ///< Factor for the multiscale derivatives
float sderivatives; ///< Smoothing factor for the derivatives
DIFFUSIVITY_TYPE diffusivity; ///< Diffusivity type
int diffusivity; ///< Diffusivity type
float dthreshold; ///< Detector response threshold to accept point
float min_dthreshold; ///< Minimum detector threshold to accept a point
cv::AKAZE::DESCRIPTOR_TYPE descriptor; ///< Type of descriptor
int descriptor; ///< Type of descriptor
int descriptor_size; ///< Size of the descriptor in bits. 0->Full size
int descriptor_channels; ///< Number of channels in the descriptor (1, 2, 3)
int descriptor_pattern_size; ///< Actual patch size is 2*pattern_size*point.scale
@@ -82,28 +75,4 @@
int kcontrast_nbins; ///< Number of bins for the contrast factor histogram
};
/* ************************************************************************* */
/// AKAZE nonlinear diffusion filtering evolution
struct TEvolution {
TEvolution() {
etime = 0.0f;
esigma = 0.0f;
octave = 0;
sublevel = 0;
sigma_size = 0;
}
cv::Mat Lx, Ly; // First order spatial derivatives
cv::Mat Lxx, Lxy, Lyy; // Second order spatial derivatives
cv::Mat Lflow; // Diffusivity image
cv::Mat Lt; // Evolution image
cv::Mat Lsmooth; // Smoothed image
cv::Mat Lstep; // Evolution step update
cv::Mat Ldet; // Detector response
float etime; // Evolution time
float esigma; // Evolution sigma. For linear diffusion t = sigma^2 / 2
size_t octave; // Image octave
size_t sublevel; // Image sublevel in each octave
size_t sigma_size; // Integer sigma. For computing the feature detector responses
};
\ No newline at end of file
#endif
This diff has been collapsed.
/**
* @file AKAZE.h
* @brief Main class for detecting and computing binary descriptors in an
* accelerated nonlinear scale space
* @date Mar 27, 2013
* @author Pablo F. Alcantarilla, Jesus Nuevo
*/
#ifndef __OPENCV_FEATURES_2D_AKAZE_FEATURES_H__
#define __OPENCV_FEATURES_2D_AKAZE_FEATURES_H__
/* ************************************************************************* */
// Includes
#include "precomp.hpp"
#include "AKAZEConfig.h"
#include "TEvolution.h"
/* ************************************************************************* */
// AKAZE Class Declaration
class AKAZEFeatures {
private:
AKAZEOptions options_; ///< Configuration options for AKAZE
std::vector<TEvolution> evolution_; ///< Vector of nonlinear diffusion evolution
/// FED parameters
int ncycles_; ///< Number of cycles
bool reordering_; ///< Flag for reordering time steps
std::vector<std::vector<float > > tsteps_; ///< Vector of FED dynamic time steps
std::vector<int> nsteps_; ///< Vector of number of steps per cycle
/// Matrices for the M-LDB descriptor computation
cv::Mat descriptorSamples_; // List of positions in the grids to sample LDB bits from.
cv::Mat descriptorBits_;
cv::Mat bitMask_;
public:
/// Constructor with input arguments
AKAZEFeatures(const AKAZEOptions& options);
/// Scale Space methods
void Allocate_Memory_Evolution();
int Create_Nonlinear_Scale_Space(const cv::Mat& img);
void Feature_Detection(std::vector<cv::KeyPoint>& kpts);
void Compute_Determinant_Hessian_Response(void);
void Compute_Multiscale_Derivatives(void);
void Find_Scale_Space_Extrema(std::vector<cv::KeyPoint>& kpts);
void Do_Subpixel_Refinement(std::vector<cv::KeyPoint>& kpts);
/// Feature description methods
void Compute_Descriptors(std::vector<cv::KeyPoint>& kpts, cv::Mat& desc);
static void Compute_Main_Orientation(cv::KeyPoint& kpt, const std::vector<TEvolution>& evolution_);
};
/* ************************************************************************* */
/// Inline functions
void generateDescriptorSubsample(cv::Mat& sampleList, cv::Mat& comparisons,
int nbits, int pattern_size, int nchannels);
#endif
@@ -5,7 +5,8 @@
* @author Pablo F. Alcantarilla
*/
#pragma once
#ifndef __OPENCV_FEATURES_2D_AKAZE_CONFIG_H__
#define __OPENCV_FEATURES_2D_AKAZE_CONFIG_H__
// OpenCV Includes
#include "precomp.hpp"
@@ -15,14 +16,8 @@
struct KAZEOptions {
enum DIFFUSIVITY_TYPE {
PM_G1 = 0,
PM_G2 = 1,
WEICKERT = 2
};
KAZEOptions()
: diffusivity(PM_G2)
: diffusivity(cv::DIFF_PM_G2)
, soffset(1.60f)
, omax(4)
@@ -33,20 +28,13 @@
, dthreshold(0.001f)
, kcontrast(0.01f)
, kcontrast_percentille(0.7f)
, kcontrast_bins(300)
, use_fed(true)
, kcontrast_bins(300)
, upright(false)
, extended(false)
, use_clipping_normalilzation(false)
, clipping_normalization_ratio(1.6f)
, clipping_normalization_niter(5)
{
}
DIFFUSIVITY_TYPE diffusivity;
int diffusivity;
float soffset;
int omax;
int nsublevels;
@@ -57,27 +45,8 @@
float kcontrast;
float kcontrast_percentille;
int kcontrast_bins;
bool use_fed;
bool upright;
bool extended;
bool use_clipping_normalilzation;
float clipping_normalization_ratio;
int clipping_normalization_niter;
};
struct TEvolution {
cv::Mat Lx, Ly; // First order spatial derivatives
cv::Mat Lxx, Lxy, Lyy; // Second order spatial derivatives
cv::Mat Lflow; // Diffusivity image
cv::Mat Lt; // Evolution image
cv::Mat Lsmooth; // Smoothed image
cv::Mat Lstep; // Evolution step update
cv::Mat Ldet; // Detector response
float etime; // Evolution time
float esigma; // Evolution sigma. For linear diffusion t = sigma^2 / 2
float octave; // Image octave
float sublevel; // Image sublevel in each octave
int sigma_size; // Integer esigma. For computing the feature detector responses
};
#endif
@@ -7,84 +7,53 @@
* @author Pablo F. Alcantarilla
*/
#ifndef KAZE_H_
#define KAZE_H_
//*************************************************************************************
//*************************************************************************************
#ifndef __OPENCV_FEATURES_2D_KAZE_FEATURES_H__
#define __OPENCV_FEATURES_2D_KAZE_FEATURES_H__
/* ************************************************************************* */
// Includes
#include "KAZEConfig.h"
#include "nldiffusion_functions.h"
#include "fed.h"
#include "TEvolution.h"
//*************************************************************************************
//*************************************************************************************
/* ************************************************************************* */
// KAZE Class Declaration
class KAZEFeatures {
private:
KAZEOptions options;
// Parameters of the Nonlinear diffusion class
std::vector<TEvolution> evolution_; // Vector of nonlinear diffusion evolution
/// Parameters of the Nonlinear diffusion class
KAZEOptions options_; ///< Configuration options for KAZE
std::vector<TEvolution> evolution_; ///< Vector of nonlinear diffusion evolution
// Vector of keypoint vectors for finding extrema in multiple threads
/// Vector of keypoint vectors for finding extrema in multiple threads
std::vector<std::vector<cv::KeyPoint> > kpts_par_;
// FED parameters
int ncycles_; // Number of cycles
bool reordering_; // Flag for reordering time steps
std::vector<std::vector<float > > tsteps_; // Vector of FED dynamic time steps
std::vector<int> nsteps_; // Vector of number of steps per cycle
// Some auxiliary variables used in the AOS step
cv::Mat Ltx_, Lty_, px_, py_, ax_, ay_, bx_, by_, qr_, qc_;
/// FED parameters
int ncycles_; ///< Number of cycles
bool reordering_; ///< Flag for reordering time steps
std::vector<std::vector<float > > tsteps_; ///< Vector of FED dynamic time steps
std::vector<int> nsteps_; ///< Vector of number of steps per cycle
public:
// Constructor
/// Constructor
KAZEFeatures(KAZEOptions& options);
// Public methods for KAZE interface
/// Public methods for KAZE interface
void Allocate_Memory_Evolution(void);
int Create_Nonlinear_Scale_Space(const cv::Mat& img);
void Feature_Detection(std::vector<cv::KeyPoint>& kpts);
void Feature_Description(std::vector<cv::KeyPoint>& kpts, cv::Mat& desc);
static void Compute_Main_Orientation(cv::KeyPoint& kpt, const std::vector<TEvolution>& evolution_, const KAZEOptions& options);
private:
// Feature Detection Methods
/// Feature Detection Methods
void Compute_KContrast(const cv::Mat& img, const float& kper);
void Compute_Multiscale_Derivatives(void);
void Compute_Detector_Response(void);
void Determinant_Hessian_Parallel(std::vector<cv::KeyPoint>& kpts);
void Find_Extremum_Threading(const int& level);
void Determinant_Hessian(std::vector<cv::KeyPoint>& kpts);
void Do_Subpixel_Refinement(std::vector<cv::KeyPoint>& kpts);
// AOS Methods
void AOS_Step_Scalar(cv::Mat &Ld, const cv::Mat &Ldprev, const cv::Mat &c, const float& stepsize);
void AOS_Rows(const cv::Mat &Ldprev, const cv::Mat &c, const float& stepsize);
void AOS_Columns(const cv::Mat &Ldprev, const cv::Mat &c, const float& stepsize);
void Thomas(const cv::Mat &a, const cv::Mat &b, const cv::Mat &Ld, cv::Mat &x);
};
//*************************************************************************************
//*************************************************************************************
// Inline functions
float getAngle(const float& x, const float& y);
float gaussian(const float& x, const float& y, const float& sig);
void checkDescriptorLimits(int &x, int &y, const int& width, const int& height);
void clippingDescriptor(float *desc, const int& dsize, const int& niter, const float& ratio);
int fRound(const float& flt);
//*************************************************************************************
//*************************************************************************************
#endif // KAZE_H_
#endif
/**
* @file TEvolution.h
* @brief Header file with the declaration of the TEvolution struct
* @date Jun 02, 2014
* @author Pablo F. Alcantarilla
*/
#ifndef __OPENCV_FEATURES_2D_TEVOLUTION_H__
#define __OPENCV_FEATURES_2D_TEVOLUTION_H__
/* ************************************************************************* */
/// KAZE/A-KAZE nonlinear diffusion filtering evolution
struct TEvolution {
TEvolution() {
etime = 0.0f;
esigma = 0.0f;
octave = 0;
sublevel = 0;
sigma_size = 0;
}
cv::Mat Lx, Ly; ///< First order spatial derivatives
cv::Mat Lxx, Lxy, Lyy; ///< Second order spatial derivatives
cv::Mat Lt; ///< Evolution image
cv::Mat Lsmooth; ///< Smoothed image
cv::Mat Ldet; ///< Detector response
float etime; ///< Evolution time
float esigma; ///< Evolution sigma. For linear diffusion t = sigma^2 / 2
int octave; ///< Image octave
int sublevel; ///< Image sublevel in each octave
int sigma_size; ///< Integer esigma. For computing the feature detector responses
};
#endif
#ifndef FED_H
#define FED_H
#ifndef __OPENCV_FEATURES_2D_FED_H__
#define __OPENCV_FEATURES_2D_FED_H__
//******************************************************************************
//******************************************************************************
@@ -22,4 +22,4 @@ bool fed_is_prime_internal(const int& number);
//*************************************************************************************
//*************************************************************************************
#endif // FED_H
#endif // __OPENCV_FEATURES_2D_FED_H__
@@ -8,8 +8,8 @@
* @author Pablo F. Alcantarilla
*/
#ifndef KAZE_NLDIFFUSION_FUNCTIONS_H
#define KAZE_NLDIFFUSION_FUNCTIONS_H
#ifndef __OPENCV_FEATURES_2D_NLDIFFUSION_FUNCTIONS_H__
#define __OPENCV_FEATURES_2D_NLDIFFUSION_FUNCTIONS_H__
/* ************************************************************************* */
// Includes
#ifndef __OPENCV_FEATURES_2D_KAZE_UTILS_H__
#define __OPENCV_FEATURES_2D_KAZE_UTILS_H__
/* ************************************************************************* */
/**
* @brief This function computes the angle from the vector given by (X Y). From 0 to 2*Pi
*/
inline float getAngle(float x, float y) {
if (x >= 0 && y >= 0) {
return atanf(y / x);
}
if (x < 0 && y >= 0) {
return static_cast<float>(CV_PI)-atanf(-y / x);
}
if (x < 0 && y < 0) {
return static_cast<float>(CV_PI)+atanf(y / x);
}
if (x >= 0 && y < 0) {
return static_cast<float>(2.0 * CV_PI) - atanf(-y / x);
}
return 0;
}
/* ************************************************************************* */
/**
* @brief This function computes the value of a 2D Gaussian function
* @param x X Position
* @param y Y Position
* @param sigma Standard Deviation
*/
inline float gaussian(float x, float y, float sigma) {
return expf(-(x*x + y*y) / (2.0f*sigma*sigma));
}
/* ************************************************************************* */
/**
* @brief This function checks descriptor limits
* @param x X Position
* @param y Y Position
* @param width Image width
* @param height Image height
*/
inline void checkDescriptorLimits(int &x, int &y, int width, int height) {
if (x < 0) {
x = 0;
}
if (y < 0) {
y = 0;
}
if (x > width - 1) {
x = width - 1;
}
if (y > height - 1) {
y = height - 1;
}
}
/* ************************************************************************* */
/**
* @brief This function rounds a float to the nearest integer
* @param flt Input float
* @return dst Nearest integer
*/
inline int fRound(float flt) {
return (int)(flt + 0.5f);
}
#endif
@@ -101,8 +101,14 @@ public:
typedef typename Distance::ResultType DistanceType;
CV_DescriptorExtractorTest( const string _name, DistanceType _maxDist, const Ptr<DescriptorExtractor>& _dextractor,
Distance d = Distance() ):
name(_name), maxDist(_maxDist), dextractor(_dextractor), distance(d) {}
Distance d = Distance(), Ptr<FeatureDetector> _detector = Ptr<FeatureDetector>()):
name(_name), maxDist(_maxDist), dextractor(_dextractor), distance(d) , detector(_detector) {}
~CV_DescriptorExtractorTest()
{
if(!detector.empty())
detector.release();
}
protected:
virtual void createDescriptorExtractor() {}
@@ -189,7 +195,6 @@ protected:
// Read the test image.
string imgFilename = string(ts->get_data_path()) + FEATURES2D_DIR + "/" + IMAGE_FILENAME;
Mat img = imread( imgFilename );
if( img.empty() )
{
@@ -197,13 +202,15 @@ protected:
ts->set_failed_test_info( cvtest::TS::FAIL_INVALID_TEST_DATA );
return;
}
vector<KeyPoint> keypoints;
FileStorage fs( string(ts->get_data_path()) + FEATURES2D_DIR + "/keypoints.xml.gz", FileStorage::READ );
if( fs.isOpened() )
{
if(!detector.empty()) {
detector->detect(img, keypoints);
} else {
read( fs.getFirstTopLevelNode(), keypoints );
}
if(!keypoints.empty())
{
Mat calcDescriptors;
double t = (double)getTickCount();
dextractor->compute( img, keypoints, calcDescriptors );
@@ -244,7 +251,7 @@ protected:
}
}
}
else
if(!fs.isOpened())
{
ts->printf( cvtest::TS::LOG, "Compute and write keypoints.\n" );
fs.open( string(ts->get_data_path()) + FEATURES2D_DIR + "/keypoints.xml.gz", FileStorage::WRITE );
@@ -295,6 +302,7 @@ protected:
const DistanceType maxDist;
Ptr<DescriptorExtractor> dextractor;
Distance distance;
Ptr<FeatureDetector> detector;
private:
CV_DescriptorExtractorTest& operator=(const CV_DescriptorExtractorTest&) { return *this; }
@@ -340,3 +348,19 @@ TEST( Features2d_DescriptorExtractor_OpponentBRIEF, regression )
DescriptorExtractor::create("OpponentBRIEF") );
test.safe_run();
}
TEST( Features2d_DescriptorExtractor_KAZE, regression )
{
CV_DescriptorExtractorTest< L2<float> > test( "descriptor-kaze", 0.03f,
DescriptorExtractor::create("KAZE"),
L2<float>(), FeatureDetector::create("KAZE"));
test.safe_run();
}
TEST( Features2d_DescriptorExtractor_AKAZE, regression )
{
CV_DescriptorExtractorTest<Hamming> test( "descriptor-akaze", (CV_DescriptorExtractorTest<Hamming>::DistanceType)12.f,
DescriptorExtractor::create("AKAZE"),
Hamming(), FeatureDetector::create("AKAZE"));
test.safe_run();
}
@@ -289,6 +289,18 @@ TEST( Features2d_Detector_ORB, regression )
test.safe_run();
}
TEST( Features2d_Detector_KAZE, regression )
{
CV_FeatureDetectorTest test( "detector-kaze", FeatureDetector::create("KAZE") );
test.safe_run();
}
TEST( Features2d_Detector_AKAZE, regression )
{
CV_FeatureDetectorTest test( "detector-akaze", FeatureDetector::create("AKAZE") );
test.safe_run();
}
TEST( Features2d_Detector_GridFAST, regression )
{
CV_FeatureDetectorTest test( "detector-grid-fast", FeatureDetector::create("GridFAST") );
@@ -167,19 +167,17 @@ TEST(Features2d_Detector_Keypoints_Dense, validation)
test.safe_run();
}
// FIXIT #2807 Crash on Windows 7 x64 MSVS 2012, Linux Fedora 19 x64 with GCC 4.8.2, Linux Ubuntu 14.04 LTS x64 with GCC 4.8.2
TEST(Features2d_Detector_Keypoints_KAZE, DISABLED_validation)
TEST(Features2d_Detector_Keypoints_KAZE, validation)
{
CV_FeatureDetectorKeypointsTest test(Algorithm::create<FeatureDetector>("Feature2D.KAZE"));
test.safe_run();
}
// FIXIT #2807 Crash on Windows 7 x64 MSVS 2012, Linux Fedora 19 x64 with GCC 4.8.2, Linux Ubuntu 14.04 LTS x64 with GCC 4.8.2
TEST(Features2d_Detector_Keypoints_AKAZE, DISABLED_validation)
TEST(Features2d_Detector_Keypoints_AKAZE, validation)
{
CV_FeatureDetectorKeypointsTest test_kaze(cv::Ptr<FeatureDetector>(new cv::AKAZE(cv::AKAZE::DESCRIPTOR_KAZE)));
CV_FeatureDetectorKeypointsTest test_kaze(cv::Ptr<FeatureDetector>(new cv::AKAZE(cv::DESCRIPTOR_KAZE)));
test_kaze.safe_run();
CV_FeatureDetectorKeypointsTest test_mldb(cv::Ptr<FeatureDetector>(new cv::AKAZE(cv::AKAZE::DESCRIPTOR_MLDB)));
CV_FeatureDetectorKeypointsTest test_mldb(cv::Ptr<FeatureDetector>(new cv::AKAZE(cv::DESCRIPTOR_MLDB)));
test_mldb.safe_run();
}
@@ -652,8 +652,7 @@ TEST(Features2d_ScaleInvariance_Detector_BRISK, regression)
test.safe_run();
}
// FIXIT #2807 Crash on Windows 7 x64 MSVS 2012, Linux Fedora 19 x64 with GCC 4.8.2, Linux Ubuntu 14.04 LTS x64 with GCC 4.8.2
TEST(Features2d_ScaleInvariance_Detector_KAZE, DISABLED_regression)
TEST(Features2d_ScaleInvariance_Detector_KAZE, regression)
{
DetectorScaleInvarianceTest test(Algorithm::create<FeatureDetector>("Feature2D.KAZE"),
0.08f,
@@ -661,8 +660,7 @@ TEST(Features2d_ScaleInvariance_Detector_KAZE, DISABLED_regression)
test.safe_run();
}
// FIXIT #2807 Crash on Windows 7 x64 MSVS 2012, Linux Fedora 19 x64 with GCC 4.8.2, Linux Ubuntu 14.04 LTS x64 with GCC 4.8.2
TEST(Features2d_ScaleInvariance_Detector_AKAZE, DISABLED_regression)
TEST(Features2d_ScaleInvariance_Detector_AKAZE, regression)
{
DetectorScaleInvarianceTest test(Algorithm::create<FeatureDetector>("Feature2D.AKAZE"),
0.08f,
<?xml version="1.0"?>
<opencv_storage>
<H13 type_id="opencv-matrix">
<rows>3</rows>
<cols>3</cols>
<dt>d</dt>
<data>
7.6285898e-01 -2.9922929e-01 2.2567123e+02
3.3443473e-01 1.0143901e+00 -7.6999973e+01
3.4663091e-04 -1.4364524e-05 1.0000000e+00 </data></H13>
</opencv_storage>
This diff is suppressed by .gitattributes.
This diff is suppressed by .gitattributes.
This diff has been collapsed.