VeriVin experiments
VeriVin and Openvino are teaming up to perform a series of experiments using VeriVin's prototype Raman spectrometer.
These experiments will begin on the 18th of October, 2019, using new installations at Costaflores Organic Vineyard.
Here is a summary of the first three simultaneous experiments that will be executed:
Do coloured wine bottles protect wine from oxidation, and what is the measurable effect of lightstrike from different types of lighting?
Which bottle closures better promote desirable wine evolution?
Can we create a unique digital fingerprint for a wine, using a spectrometer, and represent this vinoprint on the blockchain as a non-fungible token?
About VeriVin
Experiment Overview
Can we bottle 640 bottles of wine, using four different bottle colours, four different closures, and four different light sources, and perform simultaneous experiments?
Do coloured wine bottles protect the wine from oxidation?
Transparent v Green v Eco v Brown
Initial Veralia samples (Green, ECO, Brown, transparent) available now
Blue bottles not available in Mendoza
Working on Veralia QA, PR people for samples and support
Meeting, visit to factory, with film person
Light variables: Natural light (what is this?) v LED v other artificial light v dark, in the box (control)?
Define what is needed for this - design light chamber
Diurnal cycle? (simulating wine shop?)
This is an on-going study… publishing data as we go.
testing frequency (weekly?)
sensing/validating other factors - temperature, light, air quality
sample size
From what I have read in the literature, it seems that ‘light strike’ is mainly due to UV and blue light. However, considering the bottles’ absorption spectra and the high complexity of the liquid, I wouldn’t be surprised if even green and red played a role. Obviously, the intensity of radiation also matters, as does the part of this light that goes through the bottle (so, the absorption spectrum of the glass).
I think that as long as the temperature of the various enclosures is consistent, there is no need for a special dark enclosure; a well-sealed box is fine.
Given that short-wavelength light is incriminated, I would go for bulbs with high blue content, so COLD WHITE LED and generally cold light. Unfortunately, the fluorescent lamps that emit this light are halogen, and they also heat the environment heavily, so they are not advisable. I would rather compromise for something like a tube fluorescent light, as white as possible.
Have a look at these links to check if they may work:
Fluorescence:
https://www.lightbulbs.com/product/bulbrite-524053
LED array
LED panel:
I would try to keep the illumination more or less constant for each bottle, at about 800–1200 lumen to simulate the illumination of a supermarket, and/or at about 300–500 lumen to simulate a home environment.
This wikiHow is quite well made in my opinion:
https://www.wikihow.com/Measure-Light-Intensity
Also, we should illuminate all bottles in the same way and make sure the light distribution is uniform. I imagine the best solution is to stack the bottles in layers and illuminate them from the side, using multiple bulbs at regular distances. Is that something possible?
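As a rough aid for placing the bulbs, here is a minimal sketch that estimates the illuminance (lux) reaching a bottle from a bulb of a given lumen output, assuming an isotropic point source with no reflectors or nearby walls; the 800 lm output and 0.5 m distance are purely illustrative and not part of the plan.

```python
import math

def illuminance_lux(luminous_flux_lm: float, distance_m: float) -> float:
    """Approximate illuminance at distance_m from an isotropic source
    emitting luminous_flux_lm lumens: E = flux / (4 * pi * d^2)."""
    return luminous_flux_lm / (4 * math.pi * distance_m ** 2)

# Example (illustrative values): a ~800 lm bulb 0.5 m from the bottle face.
print(round(illuminance_lux(800, 0.5)))  # ~255 lux
```

Real fixtures direct most of their output forward, so measured values at the bottles (per the wikiHow link above) should take precedence over this estimate.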
I am thinking: if we first bottle the wines in the four different bottles and then scan them with the spectrometer, should we then keep ALL of them in boxes, in darkness, for a few months before our next measurement, to rule out the influence of light on any deviation occurring during that period? Or do you think it is enough to have the darkness control collection?
I think the control group kept in darkness will be enough. We have had meaningful results with 18 bottles, so 32 is already a good number. I am thinking, though, that wine undergoes quite a lot of changes in the first weeks after bottling, and we may risk affecting the experiment if we irradiate the bottles in a way that is very different from what the bottles see in the cellar. So, I guess you have a point there.
It is probably better to bring all the bottles to a “zero point” before starting the proper long-term experiment; this would also give us some flexibility in case the experiment is delayed for any reason.
Which closures better promote desirable wine evolution?
natural cork, portocork, synthetic, or screwcap
Details about bottling
Capsule?
We need to bottle them when the spectrometer is available to validate the initial state of each bottle.
Can we create a digital fingerprint for a wine?
how does this fingerprint evolve over time?
post-Fermentation (2020)
1-year in stainless steel
1-year in new oak
multi-year in bottles, stored at controlled temperature.
10000x sample size - cold storage, same light, blockchain-registered temperature
how can this fingerprint be tracked on the blockchain?
start experiment with 2018 and 2019, but start in earnest with 2020 vintage
other wines (10's) available at winery and cold-storage
Shared considerations
Location
new room at winery
costaflores
Staffing for long term testing
Yamil
Carla
Environmental constraints
Temperature
Light
Air quality
wine movement
machine safety
Costs
bottles
closures
bottling
Test staff
critic testing fees
shipping
labeling / packing for critics
spoofing
Funding
Logistics
Cross evaluation
organoleptic
schedule
tasters
chemical
lab1
lab2
lab3
chromatographic
lab1
lab2
lab3
Research
similar experiments and results
positions of critics on numerical rankings
list of caveats
Documentation
Video presentation (i.e. documentary)
Public wiki
Costaflores / VeriVin /
Reporting
On-going study…no need to wait for the findings…we publish continuously?
Promotion / marketing of experiment and results
publication
Sample matrix: Experiment 1 (lightstrike)
Measurements note (2/12/19): Measurement distance during the first VeriVin visit was 13.14 mm.
| Bottle colour | Red 3000K | Green | Blue UV | Cold White | Neutral White | Warm White | Darkness | Total |
| Green 1 | 12 | 12 | 12 | 20 | 20 | 20 | 32 | 128 |
| Green 2 | 12 | 12 | 12 | 20 | 20 | 20 | 32 | 128 |
| Brown | 12 | 12 | 12 | 20 | 20 | 20 | 32 | 128 |
| Transparent | 12 | 12 | 12 | 20 | 20 | 20 | 32 | 128 |
| Total | 48 | 48 | 48 | 80 | 80 | 80 | 128 | 512 |
Bottle position IDs: the storage grid has 16 rows (15 at the top down to 0, also indexed in hexadecimal in the original sheet) and 32 columns (31 on the left down to 0). Each position’s ID is 32 × row + column, giving IDs from 511 (row 15, column 31) down to 0 (row 0, column 0); the original sheet records every ID in both binary and decimal.
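For reference, a minimal sketch that regenerates the same position IDs (printed here with leading zeros, which the original sheet omits):

```python
# Regenerate the bottle-position IDs described above: ID = 32 * row + column.
for row in range(15, -1, -1):            # rows 15 (top) down to 0
    for col in range(31, -1, -1):        # columns 31 (left) down to 0
        bottle_id = 32 * row + col
        print(f"row {row:2d}, col {col:2d} -> id {bottle_id:3d} ({bottle_id:09b})")
```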
Light-source assignment by grid position (the label at the right of each row in the original sheet is given in the second column):

| Rows | Row labels (top to bottom) | Columns 31–20 | Columns 19–0 |
| 15–12 | Suyai, Giuliana, Transparent, Hoja Seca | Red 3000K | Warm White |
| 11–8 | Suyai, Giuliana, Transparent, Hoja Seca | Green | Neutral White |
| 7–4 | Suyai, Giuliana, Transparent, Hoja Seca | Blue UV | Cold White |
| 3–0 | Suyai, Giuliana, Transparent, Hoja Seca | Darkness (all 32 columns) | Darkness (all 32 columns) |
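To make the assignment explicit, here is a hypothetical helper that maps a grid position to its light treatment and row label, following the two tables above; the function and dictionary names are ours, not part of the experiment tooling.

```python
# Hypothetical lookup for the assignment table above; names are illustrative only.
def light_treatment(row: int, col: int) -> str:
    """Return the light source for a grid position (rows 0-15, columns 0-31)."""
    if row <= 3:
        return "Darkness"
    coloured = col >= 20                  # columns 31-20 carry the coloured/UV sources
    if row >= 12:
        return "Red 3000K" if coloured else "Warm White"
    if row >= 8:
        return "Green" if coloured else "Neutral White"
    return "Blue UV" if coloured else "Cold White"

ROW_LABELS = {3: "Suyai", 2: "Giuliana", 1: "Transparent", 0: "Hoja Seca"}

def row_label(row: int) -> str:
    """Each block of four rows repeats the same labels, top to bottom."""
    return ROW_LABELS[row % 4]

print(light_treatment(15, 25), "/", row_label(15))   # Red 3000K / Suyai
print(light_treatment(2, 7), "/", row_label(2))      # Darkness / Giuliana
```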
We measure all the bottles regularly, but each trimester we also test 1 bottle from each set in each chamber (16 bottles) through a blind tasting and chemical analysis.
That would provide us with up to 8 years’ worth of wines to sample, though the pool of each set would diminish after each trimester.
Ideally, we wouldn’t run the experiment for more than 5 years.
We may spot some outliers, so when the time comes we will have to decide whether to keep monitoring such a bottle or to open it and test whether it is faulted. For this reason it is reasonable to have some extra specimens. I think this is a good number.
Also, on a full day of measurements we are able to test about 40–45 bottles, which means we can test the whole lot during our 2 weeks in Argentina.
Don’t forget, we will also have 17,000 bottles of the same wine, bottled at the same time in one of the bottle types and kept in darkness. So we also have that control to work with.
This is a hugely interesting database, as it will allow us to challenge chemometrics over a large batch!! Definitely worth trying.
Also, I would be interested in testing a batch of (let’s say) 32 bottles.
Sample matrix: Experiment 2 (closure)
| Closure | Bottles |
| Natural cork | 32 |
| Portocork | 32 |
| Tapicork | 32 |
| Screwcap | 32 |
| Total | 128 |
Preliminary testing
VeriVin Through-Bottle Analysis of MTB Wines – A first Test – 17/3/19
CONCLUSIONS
Simply put - MTB wines are classifiable using our Raman probe and chemometrics analysis. We can conduct larger-scale experiments that could prove useful to MTB and other wine productions. VeriVin is working on figuring out how far classification can go (vintages, casks, grapes etc.), what the 'resolution' of these differentiations is, and how to mitigate the influence of coloured bottle glass. For significantly different bottles, we can already successfully classify bottles by the combined signal of contents + container. This might be useful in and of itself for identification purposes, but our goal is to classify bottles independently by contents and by container. In other words, we would like to tell you what wine and what bottle it is, independently of one another. This is one of the reasons we are working on mitigating the influence of the glass, which is more significant with coloured glass. The other effect that coloured glass has on our analysis, and one that we are also working to mitigate, is more fundamental: coloured glass absorbs a large portion of our excitation laser as well as of the Raman-scattered light we collect. Sometimes this absorption is so high that it does not yield a strong enough signal to be useful and hence does not allow us to collect data. Our estimate is that we can currently test about 60% of all bottles, and we will be increasing that percentage significantly.
Experiment / Analysis Walkthrough
In this first experiment all measurements have been taken through the same transparent bottle.
Definition of a measurement: Data gathered with VeriVin’s Raman probe
The following data was taken:
- on the 30th of April, preliminary measurements of MTB2014-1 through a transparent container
- on the 10th and 13th of May, two sets of measurements of wines MTB2012-1, MTB2014-1, MTB2014-2, MTB2016-1 and MTB2017-1
The first data set is used to create a chemometrics model (PLSDA – partial least squares discriminant analysis) and the second data set is tested on that model. A data set consists of multiple measurements taken at different times, with different spectral baselines, probing different spots on the test bottle. We were able to successfully determine which wine was scanned when applying this data to our model – classification successful. The suffixes -1 and -2 signify two different bottles, which, in the case of MTB2014-1 and -2, had a sensory (taste-able) difference.
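For readers unfamiliar with PLSDA, here is a generic, minimal sketch of the approach using scikit-learn; this is not VeriVin's actual pipeline, and the random spectra, class labels and choice of 5 latent variables are placeholders. PLS regression is fitted against one-hot class labels, and a test spectrum is assigned to the class with the largest predicted score.

```python
# Generic PLS-DA sketch (not VeriVin's pipeline). Spectra here are random placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import LabelBinarizer

wines = ["MTB2012-1", "MTB2014-1", "MTB2014-2", "MTB2016-1", "MTB2017-1"]
labels_train = wines * 4                              # 20 training measurements
X_train = np.random.rand(len(labels_train), 512)      # placeholder 512-point spectra
X_test = np.random.rand(5, 512)                       # placeholder test spectra

binarizer = LabelBinarizer()
Y_train = binarizer.fit_transform(labels_train)       # one class column per wine

plsda = PLSRegression(n_components=5)                 # 5 latent variables
plsda.fit(X_train, Y_train)

scores = plsda.predict(X_test)                        # continuous per-class scores
predicted = binarizer.classes_[scores.argmax(axis=1)] # pick the highest-scoring class
print(predicted)
```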
Since the final output is a fairly mundane table of classification as shown here, we detail some of the actual analytical components used within the model.
A First Simple Model (figures right)
Even when displaying only two of the 3 latent variables this model uses, the measurements cluster into areas of classification. All data points are correctly classified in the results. Even two different bottles of the same vintage are classifiable. Note that even though MTB2014-1 and -2 are shown in different colours, the PLSDA model is instructed to class them together – visible in the results table. When classifying MTB2014-1 and MTB2014-2 as separate classes, there is a slight confusion between the two. This is expected; however, if we were to establish a model to probe the difference between individual bottles from the same batch, we would fine-tune it, and depending on the actual physical variability the model would succeed. Here, fine-tuning means that spectra of different classes that are too similar would have to be excluded from the model, although they could still be used as test data.
For further clarification: latent variables can be thought of as components into which each of your spectra is broken down. That means that if you sum the right amounts of these components, you get back to your original measurement (ideally). How much of each of these common components a given measurement (spectrum) contains is what is displayed in the graphs above. There are more than three latent variables; depending on the task and the size of the model there might be many more. For illustrative purposes, a very simple model with only three substances to be classified is shown.
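To make the "components" intuition concrete, the sketch below uses PCA (a simpler decomposition than PLS, chosen purely for illustration and not part of VeriVin's analysis) to break placeholder spectra into a few shared components and rebuild them from their per-spectrum amounts:

```python
# Illustration only: decompose placeholder spectra into 3 shared components with PCA
# and rebuild each spectrum from its component amounts (scores).
import numpy as np
from sklearn.decomposition import PCA

spectra = np.random.rand(20, 512)          # placeholder 512-point spectra

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)        # how much of each component every spectrum contains
components = pca.components_               # the 3 shared components (3 x 512)

# Weighted sum of the components (plus the mean spectrum) approximately rebuilds a spectrum;
# the approximation is only good if 3 components really capture the data.
rebuilt = scores @ components + pca.mean_
print(np.abs(spectra - rebuilt).max())
```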
A Spectrum Example
This is what a single data point on these Latent-Variable graphs actually looks like. A spectral Raman response of 512 data points, with a wavelength assigned to each.
Adding more/ similar wines to the model (figures right)
MTB2016 is not as easily distinguishable. Note that the model changes as it is now built with MTB2016 data, meaning that the common components the algorithm searches for also change. A more detailed look into the latent variables is required to visualise the classification. The PLS algorithm in this case looks at five latent variables at the same time, and some of the models we use in these small experiments have 8 or more. Of course this cannot be visualised in one graph, but it is mathematically necessary to distinguish some test substances. Below is an example of how MTB2016 yields similar results to MTB2014 when looking at these LVs. Looking at multiple LVs (3D, right) clearly separates them and enables us to build a model.
Lastly, we apply our test data to this model again and in the results table all of them are classified correctly.
Note that measurements can vary, and some are more clearly classified than others. To ensure that a measurement is classified correctly, we can set a threshold which the measurement has to pass in order to be classified as wine X. An ideal result has measurements of one class above the threshold and all others below, as shown here (note: 2014-1 and -2 as separate classes). Keep in mind that the lowest (blue) data point here could fall under the threshold for some measurements and would then be marked as unclassified if it did not go above the threshold for any class.
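A minimal sketch of this thresholding step; the 0.5 threshold, the function name and the score values are illustrative only, not the thresholds used in the report.

```python
# Illustrative thresholding: assign a class only if its score clears the threshold.
import numpy as np

def classify_with_threshold(scores, class_names, threshold=0.5):
    """scores: (n_measurements x n_classes) predicted class scores, e.g. PLS-DA outputs."""
    results = []
    for row in scores:
        best = int(row.argmax())
        results.append(class_names[best] if row[best] >= threshold else "unclassified")
    return results

example_scores = np.array([[0.80, 0.10],   # clearly the first wine
                           [0.40, 0.45]])  # no class clears the threshold
print(classify_with_threshold(example_scores, ["MTB2014-1", "MTB2014-2"]))
# ['MTB2014-1', 'unclassified']
```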
Measurements taken two weeks previously, applied to the first simple model (no 2016):
The model results table correctly classifies this wine. However, over the time the bottle was open the wine may have changed, as it appears outside the calibration MTB2014-1 data shown above. There are other experimental reasons for drift over time, but here these were due to alignment changes, which will clearly not occur in the final device.