Progress Report
February 21st, 2006
Overview: Organisational
- "Student Talk" on Monday the 27th (length: 15+5 minutes)
- Form 12 Meeting - a crude draft with full initial content is ready
- Provisional PDF version with an enclosure: Thesis Plan/Structure
- 3 files sent by E-mail prior to this meeting; to be discussed
Overview: Paper-Related
- ISBI registration, preparation of a poster (impending)
- TMI submission
- CVPR rebuttal - a revised rebuttal submitted successfully
- Older version superseded (confirmed by the Organiser)
- MICCAI deadline in 3 weeks - ongoing work on a first draft which introduces the entropy-based approach to assessment
- BMVC and MIUA deadlines in April
Overview: Technical/Experimental Work
- Entropy-based evaluation 'tweaked' to attain better results, i.e. more monotonic curves
- Ongoing discussion on reformulating the entropic measures
- Questions arose regarding the relative strengths and weaknesses of model-based vs. overlap-based assessment
- Generation and processing of large sets containing real brain data
- Towards achieving more plausible results for real and synthetic data
- Aim is to introduce entropy as a surrogate which obviates the need to ever calculate Specificity and Generalisation
Entropy - Issues Previously Observed
- Several experiments demonstrated behaviour that was expected but not desired
- In the mockup case, results depended on how the 'model' distribution was perturbed away from being identical to the 'training' distribution, e.g. by expansion (scaling its size) or by offset (a shift in a single direction)
- The entropy difference can either grow or shrink
- This leads to considering absolute values instead
- Taking absolute values would have a negative effect:
- it is even less clear what an absolute difference means in practice
- it is simply not a principled way to proceed
Entropy - Further Thoughts
- One issue concerns the distance calculation
- Only Euclidean distance is considered in the mockup case
- There is no strong reason to believe that shuffle distance will provide a solution
- Matrices of shuffle distances have already been used with the real data
- Another issue is the discrepancy in the model, which contributes to the entropy anomalies
Proposal for Next Steps in Entropy
- We could estimate the constant in the synthetic-training entropy estimate
- Use Monte-Carlo simulation of a flat distribution (as per the paper from Hero's group)
- This would give a normalised entropy-like Specificity measure
- Effectively, this approach recognises that entropy is a relative measure
- One can arbitrarily decide relative to what, a flat distribution being the most obvious choice
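A minimal sketch of the proposed normalisation, assuming a Kozachenko-Leonenko-style nearest-neighbour entropy estimate and a unit hypercube as the flat reference (function names, sample sizes and k are illustrative, not the actual implementation):

```python
import numpy as np

def knn_entropy(samples, k=1):
    """Nearest-neighbour entropy estimate (Kozachenko-Leonenko style).
    Constants are omitted: we only need values that are comparable
    between sample sets of the same size and dimensionality."""
    n, d = samples.shape
    diffs = samples[:, None, :] - samples[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))
    dists.sort(axis=1)                  # column 0 is the zero self-distance
    eps = dists[:, k]                   # distance to the k-th neighbour
    return d * np.mean(np.log(eps + 1e-12)) + np.log(n - 1)

def normalised_specificity(model_samples, k=1, n_mc=200, seed=None):
    """Entropy of the model samples minus the Monte-Carlo entropy of a
    flat distribution on the unit hypercube of the same dimensionality,
    i.e. entropy measured relative to the flat reference."""
    rng = np.random.default_rng(seed)
    d = model_samples.shape[1]
    flat = rng.uniform(size=(n_mc, d))
    return knn_entropy(model_samples, k) - knn_entropy(flat, k)
```

A tightly clustered model distribution then scores well below zero (lower entropy than the flat reference), while samples that are themselves roughly flat score near zero.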
Proposal for Next Steps in Entropy - Ctd.
- There are still choices to be made in such simulation
- Choice #1: The size of the hypercube
- One will always be infinitely far from the nearest point in an infinitely wide flat distribution
- The choice is fairly arbitrary
- Anything sensible can be chosen, though ideally we would have an estimate of the width of the distribution of images
- This is not straightforward, since we only measure inter-example distances
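As an illustration of what can be squeezed out of inter-example distances alone, here is a crude width estimate (a sketch under a flat 1-D assumption, where E|X - Y| = w/3 for X, Y ~ U[0, w]; `width_from_distances` is a hypothetical helper, not part of the actual work):

```python
import numpy as np

def width_from_distances(dist_matrix):
    """Crude width estimates using only inter-example distances:
    the diameter (largest pairwise distance), a lower bound on the
    support width, and 3x the mean pairwise distance, which is exact
    for a flat 1-D distribution on [0, w]."""
    n = dist_matrix.shape[0]
    upper = dist_matrix[np.triu_indices(n, k=1)]
    return upper.max(), 3.0 * upper.mean()
```

The 3x-mean estimate is only exact for the flat 1-D case; for image data it would at best set the scale of the hypercube, which is all the simulation needs.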
Proposal for Next Steps in Entropy - Ctd.
- Choice #2: The dimensionality of the space
- Real dimensionality of the space in which we are measuring image differences is almost certainly not equal to the number of pixels
- The paper purports to show how to estimate entropy and dimensionality at the same time
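The idea in that paper can be sketched as follows: the total length of the k-nearest-neighbour graph over n samples grows roughly as n^((m-gamma)/m) for intrinsic dimension m, so fitting the growth rate over subsets of increasing size yields m (and, in the full method, the entropy as well). A rough illustration under those assumptions, not the authors' implementation:

```python
import numpy as np

def knn_graph_length(points, k=4, gamma=1.0):
    """Sum of gamma-powered k-nearest-neighbour edge lengths."""
    diffs = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diffs ** 2).sum(axis=2))
    d.sort(axis=1)
    return (d[:, 1:k + 1] ** gamma).sum()   # skip the zero self-distance

def estimate_intrinsic_dim(points, sizes, k=4, gamma=1.0, seed=None):
    """Fit log L(n) ~ a*log(n) + b over random subsets of size n;
    asymptotically a = (m - gamma)/m, so m = gamma / (1 - a)."""
    rng = np.random.default_rng(seed)
    logs_n, logs_len = [], []
    for n in sizes:
        idx = rng.choice(len(points), size=n, replace=False)
        logs_n.append(np.log(n))
        logs_len.append(np.log(knn_graph_length(points[idx], k, gamma)))
    a, _ = np.polyfit(logs_n, logs_len, 1)
    return gamma / (1.0 - a)
```

On data lying on a low-dimensional subspace of a high-dimensional ambient space, the estimate recovers the subspace dimension rather than the pixel count, which is the point being made above.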
ISBI Registration and Preparation
- Camera-ready version submitted
- The delay had no effect whatsoever on decisions (the Organiser confirmed this)
- Registration no longer an issue
- Need to prepare a suitable poster
Form 12 Meeting
- According to the Handbook, this meeting needed to be arranged to take place in March
- Due to procrastination in the past (forms overdue), the meeting was scheduled early
- Form 12 sent by E-mail
- Includes a broad, 1-page progress report
- An appendix is included, though it largely re-uses previous discussions of the thesis structure
Monday Talk
- Same presentation as MIAS-IRC 2005
- The talk's duration is the same as for Student Talks
- Two such talks are to take place during that meeting
- The talks also count towards securing a place at the Faculty Showcase
TMI Submission and Commentary
- S. Warfield put down as preferred referee
- Decision by May 1st
- Bill submitted a paper which deals with generalised overlap measures (in isolation)
- His glance at the paper we had submitted raised some specific comments
- It also produced a general critique and suggestions for subsequent work
Comments on Overlap- and Model-based Assessment
- Interesting discussion points regarding the differences between label overlap measures and model-based measures
- Bill argues:
- "If I have understood the paper correctly, your models are constructed globally over the whole image. Therefore any change in the set of non-rigid registrations performed to generate the correspondences used to construct the models should be reflected in the set of model parameters you obtain..."
Comments on Overlap- and Model-based Assessment - Ctd.
- Further on the topic:
- "...By contrast, the overlap measures will only be sensitive to changes in the positions of edges (or more generally where there are gradients across fractional label voxels) of the included labels. In your experiments you have a small number of fairly large labels which is reflected in the sensitivity of the label overlap measures.
In general the sensitivity of the overlap measures will be a function of the number, shape and spatial distribution of labels contributing to the overall overlap measure.
It's too late for this submission but it would be interesting to test this for any revised version."
Experiments with Entropic Measures
- Further investigation of entropic measures (applied to real brain data)
- Larger sets obtained which provide synthetic-to-synthetic samples
- Still only dealing with the first instantiation, so judging by previous results it should not give monotonic curves
- The experiments can now be repeated meaningfully: instead of treating the problem at a shallow level, we can run a systematic set of experiments with predictable outcomes
Gamma Experiments
- Attempt to identify good values of Gamma to choose in practice
- The choice of Gamma appears to affect the behaviour of the curves
- Increasing the value of Gamma possibly improves monotonicity
Altering the Number of Dimensions
- Alpha is calculated as follows:
alpha = (d - gamma) / d
- Thus the number of dimensions d (very large in our case) plays a minute role: for fixed gamma, alpha tends to 1 as d grows
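This can be checked directly; with gamma of order 1 and the dimension counts used here, alpha is indistinguishable from 1 (a trivial sketch):

```python
def alpha(d, gamma=1.0):
    # alpha = (d - gamma) / d -> 1 as d grows, so at tens of
    # thousands of dimensions the exact value of d barely matters
    return (d - gamma) / d

for d in (13689, 136890):   # the two dimension counts used below
    print(d, alpha(d))
```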
Smaller Number of Dimensions
![entropy-and-specificity-dimensions-13689-dimensions.png](entropy-and-specificity-dimensions-13689-dimensions.png)
13689 dimensions
Larger Number of Dimensions
![entropy-and-specificity-dimensions-136890-dimensions.png](entropy-and-specificity-dimensions-136890-dimensions.png)
136890 dimensions
The K Nearest Neighbours Approach
- Increasing k in the k-nearest-neighbours approach makes the curves more monotonic
- This is an interesting observation
- The effect of changing k in the mockup case was not prominent
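One way to see why larger k could smooth the curves: averaging over the first k neighbour distances damps the sampling noise of any single nearest-neighbour distance. A sketch only; the distance-to-training-set quantity below is illustrative, not the exact measure used:

```python
import numpy as np

def mean_knn_distance(synthetic, training, k):
    """Mean, over synthetic examples, of the average distance to the
    k nearest training examples; larger k pools more neighbours and
    so reduces the variance of each per-example value."""
    diffs = synthetic[:, None, :] - training[None, :, :]
    d = np.sqrt((diffs ** 2).sum(axis=2))
    d.sort(axis=1)
    return d[:, :k].mean()
```

Note the trade-off: the pooled value also grows with k (farther neighbours are included), so curves for different k are not directly comparable in absolute terms.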
K Nearest Neighbour with 5 Neighbours
K Nearest Neighbour with Various Values of K
Target Conferences (or Ones of Relevance)
- Chronologically-ordered, with deadline appended: