Non-stationary processes in gases, liquids, plasmas, and multiphase media are recorded with specialised equipment, which has evolved from direct observation by eye to observation with pinhole cameras and glass optics [1, 2]. Visualization of flow parameters is based on the physical properties of electromagnetic radiation and its interaction with the medium, involving the following physical processes:
• Scattering
• Refraction
• Absorption
• Reflection
• Interference
• Dispersion
• Luminescence
• Emission
From the middle of the 19th century, photographic equipment began to be used for scientific visualization. The French naturalist Étienne-Jules Marey was the first to use multi-frame photography to record flow motion. His chronophotographic gun (1882) was capable of recording 12 consecutive frames per second. In a wind tunnel he visualized flow around obstacles and the transition to turbulence using a smoke system. Later, film cameras, video cameras, and high-speed drum cameras were used. In the middle of the 20th century, electro-optical devices appeared. Modern digital high-speed cameras can record fast processes in fluids at rates of up to 1 million frames/s. To capture films of non-stationary flows, the visualisation equipment must have an exposure time shorter than the minimal characteristic time of the flow; the same requirement applies to the time interval between two frames.
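As a quick illustration of this requirement, the sketch below estimates the maximum allowable exposure time and minimum frame rate from an assumed characteristic flow time; the time scale and safety factor are illustrative values, not taken from the experiments described here.

```python
# Minimal sketch: frame-rate requirement for resolving a non-stationary flow.
# The characteristic time and safety margin below are assumed example values.

def required_recording_parameters(characteristic_time_s: float, margin: float = 10.0):
    """Return (max exposure time, min frame rate) so that both the exposure and
    the inter-frame interval are 'margin' times shorter than the flow time scale."""
    max_exposure_s = characteristic_time_s / margin
    min_frame_rate_hz = margin / characteristic_time_s
    return max_exposure_s, min_frame_rate_hz

# Example: a flow feature that evolves on a 0.1 ms time scale.
exposure, rate = required_recording_parameters(1e-4)
print(f"exposure <= {exposure * 1e6:.1f} us, frame rate >= {rate:.0f} frames/s")
```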
One purpose of obtaining visual information about flows (fluid dynamic processes) is the creation of experimental databases for the verification of computational fluid dynamics (CFD) software.
The challenge of CFD code verification is matching experimental and calculated visualization data, i.e. flow images. 3D and 4D data (animations) are problematic for such matching, so 2D images extracted from animations should be used. A large number of consecutive images requires a special modern approach to be matched.
The use of machine learning is a promising approach to the analysis of experimental digital animations in non-stationary fluid dynamics.
The slow motion of liquids may be recorded with digital cameras at 10 000 frames per second with sufficient spatial resolution. The resolution of modern thermal imaging cameras is considerably lower than that of optical cameras, and their frame rate is below 1000 frames/s.
Fig. 1 presents a thermographic image taken from a video film of a turbulent boundary water flow recorded at 115 Hz: a submerged hot jet impinging into cold water (false colors). Recording through an IR-transparent window, the thermal imager visualizes instantaneous images of infrared radiation from a thin near-surface water boundary layer [3, 4]. The time-dependent curves of the measured time-resolved thermal signals at four points of the flow field are obtained using FLIR software (Fig. 1).
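The curves in Fig. 1 were obtained with the commercial (FLIR/Altair) software; purely for illustration, the sketch below shows how equivalent point-wise temperature histories could be extracted from an exported thermogram stack. The file name, array layout and probe coordinates are hypothetical.

```python
import numpy as np

# Minimal sketch: temperature-time curves at selected pixels from an exported
# thermographic sequence. The file name, array layout and probe coordinates are
# hypothetical; a real FLIR/Altair export would first be converted to this form.
frames = np.load("thermo.npy")            # shape: (n_frames, height, width), degC
frame_rate_hz = 115.0                     # recording rate given in the text

probes = [(120, 80), (150, 110), (180, 140), (210, 170)]   # (row, col) points
time_s = np.arange(frames.shape[0]) / frame_rate_hz

for i, (r, c) in enumerate(probes, start=1):
    temperature = frames[:, r, c]         # temperature history at one point
    print(f"point {i}: record length {time_s[-1]:.2f} s, "
          f"T_mean = {temperature.mean():.2f}, T_max = {temperature.max():.2f} degC")
```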
Fig. 1. A frame of thermographic animation of the impact jet and time
evolution of the temperature at four points (Altair software).
The application of modern optical high-speed digital cameras allows studying high-speed non-stationary processes (including supersonic flows) at recording rates of more than a million frames per second. Fig. 2 shows 4 shadowgraph frames of the water jet formation process, visualizing the motion of the jet tip (leader) region. The time interval between frames, 0.03 ms, is sufficient for speed measurement.
Fig. 2. Shadowgraph images of the water jet captured by a high-speed camera (light absorption) at 100 000 frames/s
The leader speed (measured manually)
increases from 30 to 250 m/s.
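The manual speed measurement amounts to dividing the displacement of the jet leader between consecutive frames by the inter-frame interval; a minimal sketch of that calculation is shown below. The pixel scale and the leader positions are illustrative assumptions, not measured values.

```python
# Minimal sketch: leader speed from frame-to-frame displacement.
# The spatial calibration and leader positions are illustrative assumptions.
frame_interval_s = 0.03e-3        # 0.03 ms between frames (from the text)
mm_per_pixel = 0.05               # assumed spatial calibration

leader_positions_px = [100, 130, 190, 280]   # hypothetical leader positions

for i in range(1, len(leader_positions_px)):
    displacement_m = (leader_positions_px[i] - leader_positions_px[i - 1]) * mm_per_pixel * 1e-3
    speed_m_s = displacement_m / frame_interval_s
    print(f"frame {i - 1} -> {i}: leader speed = {speed_m_s:.0f} m/s")
```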
Fig. 3 shows 4 consecutive shadowgraph images of 2 shock waves initiated by nanosecond discharges, recorded with a high-speed digital camera (124 000 frames/s). Here a cylindrical shock (blast) wave interacts with a plane shock.
Fig. 3. Shadowgraph images (light refraction) captured by
high-speed camera at 124 000 frames/s
The structure of a supersonic gas flow can be visualized by an electrical discharge [5]. Discharge plasma in a gas volume follows the shapes of discontinuities, inhomogeneities, and streamlines because the discharge electric current strongly depends on the E/N ratio [5, 6]. Fig. 4 presents 9 frames of the discharge glow zone: ionization upon interaction of a shock wave with a pulsed volume discharge in the nanosecond range. The images are taken with a high-speed CCD camera connected to a PC. An animation of the 9 frames is assembled to study the dynamics of the nanosecond-scale evolution of the plasma configuration.
Fig. 4. Images of discharge glow captured by high-speed
camera; 100 ns exposure; 100 ns time interval between frames.
Applications employing machine learning in geophysics and in oil and gas technologies enable today's computers to analyze huge amounts of relevant data quickly and accurately. Large volumes of disparate oil and gas data are analyzed by machine learning algorithms to discover relationships that had not been identified previously.
A topical problem of the panoramic digital experiment in fluid dynamics is animation data analysis, which is a big data problem. In experimental fluid dynamics, a huge amount of digital information is accumulated today, obtained during video filming with digital cameras of different types, thermal imagers, etc. The development of digital technologies leads to a multiple increase in the arrays of data on gas-dynamic and thermophysical flow field parameters. The large arrays of digital data obtained often cannot be processed manually. Thus, modern video films recording the evolution of turbulent fluid flows by shadow methods, tracing, and thermography require processing and qualified analysis. A transition to another level of data analysis in fluid dynamics is therefore to be expected.
When working with big data, machine learning may help in analyzing large data arrays (in our case, flow images) [7]. So far, very few papers have been devoted to this problem, but their number is growing rapidly. Neural networks can effectively capture gas flow structures on large datasets [8] and predict [9] and reconstruct [10] flow development using Image Retrieval, Template Matching, Parameters Regression, Spatiotemporal Prediction and other techniques. Deep learning may be used to model high-dimensional gas-dynamic systems such as turbulence [11]. Image classification and object detection systems are being developed, for example, for shock wave detection [8], bow-shock refraction angle tracking [12], or vortex wake detection and classification [13].
Different computer vision algorithms are also applied for processing digital animations of flows. The most widely used methods are edge detection, background image subtraction, and noise removal [14].
In the present study two software tools were developed for the automatic detection and tracking of flow structures. The first tool is our in-house code for shock wave detection based on modified Canny edge detection and Hough transform algorithms [15, 16]. Edge detection is used to represent possible shock wave boundaries, and the Hough transform is used to find boundaries close to a straight line. We also apply line length and angle filters in our code and combine close short lines into one.
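The details of our in-house implementation are given in [15, 16]; the sketch below only illustrates the general Canny-plus-Hough pipeline with OpenCV. The thresholds, length and angle limits, and the input file name are assumed example values, not the parameters used in our code.

```python
import cv2
import numpy as np

# Illustrative Canny + probabilistic Hough pipeline for straight shock fronts.
# Thresholds, length/angle limits and the file name are assumed example values.
image = cv2.imread("shadowgraph_frame.png", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(image, (5, 5), 0)          # suppress noise before edge detection
edges = cv2.Canny(blurred, 50, 150)                   # candidate shock wave boundaries

lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                        minLineLength=40, maxLineGap=10)

detected = []
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        length = np.hypot(x2 - x1, y2 - y1)
        angle_deg = np.degrees(np.arctan2(abs(y2 - y1), abs(x2 - x1)))
        # keep only sufficiently long, oblique segments (example filter)
        if length > 50 and 10 < angle_deg < 80:
            detected.append(((x1, y1), (x2, y2), angle_deg))

for (p1, p2, angle_deg) in detected:
    print(f"oblique segment {p1}-{p2}, angle = {angle_deg:.1f} deg")
```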
Fig. 5 shows an example
of shadowgraph image processing using our software: oblique shock detection and
automatic angle calculation. The entire video contains several hundred frames.
The oblique shock was created by a small obstacle placed on the bottom wall of
the shock tube channel.
Fig. 5. Automatic image processing: applying edge detection and Hough
transform to detect oblique shock and calculate its angle; first row – source
images; second row – edge and line detection; third row – source image with the
detected oblique shock and calculated angle.
The second software tool was developed using a convolutional neural network based on the well-known YOLOv2 architecture. The network was trained to detect three classes of objects in the images: shock waves, convective plumes, and tracer particles. We used up to 800 images for training; some of them are featured in our online gallery [17]. Detailed information about the software is given in [15, 16, 18]. Fig. 6 shows example frames of the post-discharge thermal plume development and its automatic detection by the neural network. The full animation contains several thousand images.
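Our trained network and its parameters are described in [15, 16, 18]; purely for illustration, the sketch below shows how detections could be obtained from a Darknet-format YOLOv2 model with OpenCV's DNN module. The config and weights file names, input size, class list, and confidence threshold are hypothetical.

```python
import cv2
import numpy as np

# Illustrative inference with a Darknet-format YOLOv2 detector via OpenCV DNN.
# File names, class list, input size and threshold are hypothetical examples.
CLASSES = ["shock_wave", "convective_plume", "tracer_particle"]
net = cv2.dnn.readNetFromDarknet("flow_yolov2.cfg", "flow_yolov2.weights")

frame = cv2.imread("plume_frame.png")
h, w = frame.shape[:2]
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
detections = net.forward()   # rows of [cx, cy, bw, bh, objectness, class scores...]

for row in detections:
    scores = row[5:]
    class_id = int(np.argmax(scores))
    confidence = float(scores[class_id])
    if confidence > 0.5:
        cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
        x, y = int(cx - bw / 2), int(cy - bh / 2)
        print(f"{CLASSES[class_id]}: confidence {confidence:.2f}, "
              f"box x={x}, y={y}, w={int(bw)}, h={int(bh)}")
```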
Fig. 6. Automatic image processing: thermal plume detection
using convolutional neural network
We also developed a specific in-house code for bow-shock position tracking based on pixel intensity analysis. We processed a shadowgraph image sequence with a bow-shock wave near the model in a flow with M ≈ 2. The experiment is described in [19]. The image processing algorithm includes brightness averaging along a given band (Fig. 7, a), detection of the local extrema of the mean brightness, and a search for the local extrema pattern corresponding to the bow shock, based on the distances between the local extrema on the intensity-distance plane (Fig. 7, b).
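The exact extrema pattern used in our code is described above only qualitatively; the sketch below illustrates the general idea (band averaging, local extrema detection, search for a close minimum-maximum pair). The band coordinates, peak-detection parameters, and file name are assumptions.

```python
import cv2
import numpy as np
from scipy.signal import find_peaks

# Illustrative bow-shock search from a band-averaged brightness profile.
# Band coordinates, peak parameters and the file name are assumed examples.
frame = cv2.imread("bow_shock_frame.png", cv2.IMREAD_GRAYSCALE)

row_start, row_end = 200, 240                        # horizontal band across the bow shock
profile = frame[row_start:row_end, :].mean(axis=0)   # mean brightness vs. x coordinate

# Local maxima (bright fringe) and minima (dark fringe) of the mean brightness.
maxima, _ = find_peaks(profile, prominence=5)
minima, _ = find_peaks(-profile, prominence=5)

# Look for a dark/bright fringe pair separated by only a few pixels.
shock_x = None
for m in minima:
    close_max = maxima[(maxima > m) & (maxima - m < 10)]
    if close_max.size:
        shock_x = int(m)
        break

print("detected bow-shock x position (pixels):", shock_x)
```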
Fig. 7. Bow shock detection based on the light intensity
distribution analysis
Fig. 8 shows a sequence of images with automatic bow shock detection. The recording frame rate was 100 000 frames/s. The vertical line indicates the detected bow-shock position. The detected position increases from frame 1 to 40 and then oscillates near a constant value due to turbulent pulsations of the flow.
Fig. 8. Bow shock tracking image sequence. The number on
images is a frame number
The distance versus time (frame number) dependency was obtained (see Fig. 9). The distance was measured from the left boundary of the image. This allows us to study the bow shock stand-off distance from the model. The distance strongly depends on the flow velocity and its turbulent pulsations.
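The curve in Fig. 9 follows directly from the per-frame detections; the sketch below shows, with an assumed pixel scale and hypothetical detected positions, how the position series is converted into a distance-versus-time dependency.

```python
import numpy as np

# Illustrative conversion of per-frame bow-shock detections to distance vs. time.
# The pixel scale and the detected positions are hypothetical example values.
frame_rate_hz = 100_000.0
mm_per_pixel = 0.1

shock_x_px = np.array([50, 55, 61, 68, 74, 78, 80, 79, 81, 80])  # hypothetical detections

time_ms = np.arange(shock_x_px.size) / frame_rate_hz * 1e3
distance_mm = shock_x_px * mm_per_pixel       # measured from the left image boundary

for t, d in zip(time_ms, distance_mm):
    print(f"t = {t:.3f} ms, bow-shock distance = {d:.1f} mm")
```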
Fig. 9. Bow shock position versus frame number (time)
dependency
Digital scientific animation is one of the main tools for studying non-stationary flows. Modern high-speed cameras support video recording at a frame rate of up to 1 000 000 frames/s, which makes it possible to study high-speed processes. Digital animations may be matched against CFD calculations to validate the CFD code and improve its accuracy; some examples can be found in the review [20]. Here we presented an example of temperature evolution extracted from thermographic animations (IR radiation emission) recorded at 115 Hz to measure fluid temperature with a high frame rate at different image points using commercial software. We also analyzed high-speed shadowgraph images (based on refraction) of different high-speed gas flows. The supersonic water jet formation process was recorded at a frame rate of 100 000 frames/s (an example of light absorption). Shadowgraph animations of the shock waves created by pulsed sliding discharges were recorded at 124 000 frames/s (a light refraction example). We also presented 9 sequential images obtained by a special high-speed CCD camera with a 100 ns delay between frames (plasma radiation emission). Such short time intervals are suitable for pulsed electrical discharge visualization, which can also be presented as a short animation.
Thus, flow animations obtained with different methods of electromagnetic radiation recording give a lot of digital visual information about gas dynamics and thermophysics. To study the big data of shock wave evolution (the bow shock and the oblique shock in the channel, as well as the thermal plume in the flow behind the shock initiated by a linear discharge), we developed two software tools for automatic detection and tracking of flow structures, based on machine vision and machine learning techniques for automatic animation processing.
The first tool is our in-house code for shock wave detection based on modified Canny edge detection and Hough transform algorithms. The second software tool was developed using a convolutional neural network based on the YOLOv2 architecture.
New quantitative information was obtained on the bow shock position evolution over a time interval of 3-4 ms, and the change of the oblique shock angle was also calculated.
This study was carried out within the
framework of the Development Program of the Interdisciplinary Scientific and
Educational School of Moscow State University “Photonic and Quantum
Technologies: Digital Medicine.”
[1] Emelyanov V. N., Volkov K. N. Visualization of physical and mathematical modeling data in gas dynamics // Moscow: Fizmatlit, 360 p., 2018, ISBN: 978-5-9221-1774-6.
[2] Settles G. S. Schlieren and Shadowgraph Techniques: Visualizing Phenomena in Transparent Media // Springer, 2001, ISBN: 978-3-642-56640-0.
[3] Znamenskaya I. A., Koroteeva E. Yu., Shirshov Ya. N., Novinskaya A. M., Sysoev N. N. High speed imaging of a supersonic waterjet flow // Quantitative InfraRed Thermography Journal, Vol. 14, No. 2, 2017, pp. 185–192 (doi: 10.1080/17686733.2016.1243749).
[4] Bolshukhin M. A., Znamenskaya I. A., Fomichev V. I. A method of quantitative analysis of rapid thermal processes through vessel walls under nonisothermal liquid flow // Dokl. Phys., Vol. 60, 2015, pp. 524–527 (doi: 10.1134/S1028335815110014).
[5] Nishio M., Sezaki S., Nakamura H. Visualization of flow structure around a hypersonic re-entry capsule using the electrical discharge method // Journal of Visualization, Vol. 7, 2004, pp. 151–158 (doi: 10.1007/BF03181588).
[6] Znamenskaya I. A., Koroteev D. A., Popov N. A. A nanosecond high-current discharge in a supersonic gas flow // High Temperature, Vol. 43, 2005, pp. 817–824 (doi: 10.1007/s10740-005-0129-x).
[7] Brunton S. L., Noack B. R., Koumoutsakos P. Machine Learning for Fluid Mechanics // Annual Review of Fluid Mechanics, Vol. 52, 2020, pp. 477–508 (doi: 10.1146/annurev-fluid-010719-060214).
[8] Monfort M., Luciani T., Komperda J., Ziebart B., Mashayek F., Marai G. E. A Deep Learning Approach to Identifying Shock Locations in Turbulent Combustion Tensor Fields // Modeling, Analysis, and Visualization of Anisotropy, 2017, pp. 375–392 (doi: 10.1007/978-3-319-61358-1_16).
[9] Harel R., Rusanovsky M., Fridman Y., Shimony A., Oren G. Complete Deep Computer-Vision Methodology for Investigating Hydrodynamic Instabilities // In: Jagode H., Anzt H., Juckeland G., Ltaief H. (eds) High Performance Computing. ISC High Performance 2020. Lecture Notes in Computer Science, Vol. 12321, 2020, pp. 61–80 (doi: 10.1007/978-3-030-59851-8_5).
[10] Ott C., Pivot C., Dubois P., Gallas Q., Delva J., Lippert M., Keirsbulck L. Pulsed jet phase-averaged flow field estimation based on neural network approach // Experiments in Fluids, Vol. 62, No. 79, 2021 (doi: 10.1007/s00348-021-03180-0).
[11] Kutz J. Deep learning in fluid dynamics // Journal of Fluid Mechanics, Vol. 814, 2017, pp. 1–4 (doi: 10.1017/jfm.2016.803).
[12] Dehghan Manshadi M., Vahdat-Nejad H., Kazemi-Esfeh M., Alavi M. Speed Detection in Wind-tunnels by Processing Schlieren Images // IJE Transactions A: Basics, Vol. 29, No. 7, 2016, pp. 962–967 (doi: 10.5829/idosi.ije.2016.29.07a.11).
[13] Colvert B., Alsalman M., Kanso E. Classifying vortex wakes using neural networks // Bioinspiration & Biomimetics, Vol. 13, No. 2, 2018 (doi: 10.1088/1748-3190/aaa787).
[14] Li G., Burak Agir M., Kontis K., Ukai K., Rengarajan S. Image Processing Techniques for Shock Wave Detection and Tracking in High Speed Schlieren and Shadowgraph Systems // Journal of Physics: Conference Series, Vol. 1215, 2019 (doi: 10.1088/1742-6596/1215/1/012021).
[15] Znamenskaya I. A., Doroshchenko I. A. Edge detection and machine learning for automatic flow structures detection and tracking on schlieren and shadowgraph images // Journal of Flow Visualization and Image Processing, Vol. 28, No. 4, 2021, pp. 1–26 (doi: 10.1615/JFlowVisImageProc.2021037690).
[16] Znamenskaya I., Doroshchenko I., Tatarenkova D. Edge Detection and Machine Learning Approach to Identify Flow Structures on Schlieren and Shadowgraph Images // CEUR Workshop Proceedings, Vol. 2744, 2020, pp. 1–14 (doi: 10.51130/graphicon-2020-2-3-15).
[17] Gallery of Photos and Videos, 2021. URL: http://molphys.phys.msu.ru/galery.
[18] Znamenskaya I., Doroshchenko I., Sysoev N. Edge detection and machine learning application for shadowgraph and schlieren images analysis // Proceedings of the 19th International Symposium on Flow Visualization, Shanghai Jiaotong University Press, 2021, pp. 121–130.
[19] Znamenskaya I. A., Naumov D. S., Sysoev N. N., Chernikov V. A. Analysis of Dynamic Processes Occurring during Generation of Plasmoid Formations in a Supersonic Flow // Technical Physics, Vol. 64, No. 6, 2019, pp. 802–806 (doi: 10.1134/S1063784219060252).
[20] Znamenskaya I. A. Methods for Panoramic Visualization and Digital Analysis of Thermophysical Flow Fields. A Review // Scientific Visualization, Vol. 13, No. 3, 2021, pp. 125–158 (doi: 10.26583/sv.13.3.13).