Video Analysis
Proving the video's authenticity and the use of gravitic propulsion.
If you doubt me, you can analyze the video yourself to verify its authenticity and uncover the drone's technology. Although I used AI to quantify and summarize this analysis, you can independently verify every step using any method of your choice. Once again, I have to use AI given that I'm not a quantum physicist. Regardless, no conventional propulsion system or sensor artifact can explain all aspects of this video. You don't have to be a genius to notice the anomalies present within the video. You are welcome to skip this section if you find technical details boring and you already know what this drone is. This section is really just for the physicists, fellow nerds, and skeptics. The video file, an AI analysis chat breaking down the anomalies, the hex data (which is also anomalous and independently verifiable), and the commands and results are at the bottom of the page. This analysis is not 100% perfect, as I am not perfect, and AI is certainly not perfect. What matters most is my testimony, the authenticity of this video, and Matthew Livelsberger's manifesto. Nearly all of the anomalies I'm about to show you have no explanation that follows the laws of physics or can be attributed to normal sensor behavior. By the end of this analysis, you should be confident that this drone is distorting space, time, and light around it. I will also repeat some aspects mentioned in the last section. If you are still not convinced, read the "Who Am I" section for the biological effects of the drone. The video's authenticity and the technology powering the drone cannot be debunked.
Metadata and camera roll
Can't fake this.
Drone pulsing a radial field
Visible radial distortion area surrounding the drone throughout the video
When you increase the luminance of the video, it becomes clear that a radial field distorting light surrounds the drone throughout its flight.
Luminance suppression near the drone
This zone consistently shows the lowest luminance in every frame.
RGB values below [30,30,30]
A field extending horizontally around the drone could bend light away, suppressing detected luminance. A gravitic bubble interacting with content near the drone may refract ambient light, acting like a horizontal compression gradient. Such suppression is consistent with gravitational field zones affecting incoming illumination directionally. The dark pixels behind the drone's path perfectly match how a repulsive gravitic propulsion drone pulls in the light from behind it.
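To make this checkable, here is a minimal sketch of how the suppressed-luminance zone can be quantified, assuming the video has already been decoded into RGB NumPy frames (e.g. with ffmpeg). The helper name and the synthetic frame are mine; the [30, 30, 30] cutoff matches the one quoted above.

```python
import numpy as np

def suppressed_pixel_fraction(frame_rgb, threshold=30):
    """Fraction of pixels whose R, G, and B values all sit below `threshold`."""
    mask = np.all(frame_rgb < threshold, axis=-1)
    return float(mask.mean())

# Synthetic example: a mid-grey frame with a dark 20x20 patch standing in
# for the suppressed zone near the drone.
frame = np.full((100, 100, 3), 120, dtype=np.uint8)
frame[40:60, 40:60] = 10
frac = suppressed_pixel_fraction(frame)   # 400 / 10000 = 0.04
```

Running this per frame and plotting `frac` over time is enough to confirm whether the region around the drone really holds the lowest values in every frame.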
Redshift and blueshift zone after adjusted color scopes
Further proves the redshift and blueshift zones that the drone causes.
Light's path revealed with color keys
Keying (removing) red, blue, and purple in the video.
Above is the drone video with dark purple removed. When you apply the "Ultra Key" effect in Premiere Pro, a visible yellow aura surrounding the back of the redshift around the drone becomes apparent. Keying out purple removes those components from the image, leaving primarily green and any residual red, which appears as yellow where purple once blended with red (red + green = yellow). This reveals that the purple contained a significant amount of both red and blue, and that green was largely absent from those areas. The presence of a yellow aura behind the drone strongly suggests that the blueshifted light is being absorbed, redirected, or pulled into the drone, rather than simply trailing off into space. The light on the drone that disappears also turns yellow, further proving this phenomenon.
When you key out the blue, the blueshifted light in the drone's trail becomes obvious, further proving the drone compresses and attracts spacetime behind it.
This is when the "Color Key" effect is applied to the red in the video. As you can see, there is a denser concentration of red directly in front of the drone.
Raising luminance reveals the transparent middle of the drone
Further proves the transparent nature of the drone and suppressed color values.
At the start of the video, the drone is facing away from the camera which is why it appears invisible. The drone starts rotating towards the camera as I zoom. As I mentioned earlier, the drone was transparent in the middle. There's no picture for this because I forgot the Lumetri color settings I used to bring this anomaly out. Either trust me or analyze the video yourself.
True Black Pixels
True black pixels enlarged x5.
True black pixels
Spikes in absolute black pixels (RGB = [0,0,0]) near known field events. Blue pixels above represent true black values.
These values reflect regions where no light reaches the sensor either through extreme lensing (bending light around the object) or direct spacetime occlusion. The second true black value is present in the center of the drone. This is a plausible result of a localized gravitational shadowing effect, similar to what happens around black holes or warp field bubbles, where light paths are redirected away from sensors. As the camera enters the drone's blueshift zone on frame 143, the entire frame is true black. This occurs either from the sensor performing a hard reset due to a high-energy EM field/spacetime distortion zone or due to the other reasons above. The exact reason becomes less significant upon realizing no known propulsion technology causes any of these phenomena, especially when shielded by a roof.
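Counting true-black pixels is easy to reproduce independently. This is an illustrative sketch (the function name and synthetic frames are mine), assuming decoded RGB frames:

```python
import numpy as np

def true_black_count(frame_rgb):
    """Number of pixels that are exactly [0, 0, 0]."""
    return int(np.all(frame_rgb == 0, axis=-1).sum())

# Synthetic frame: a dim background with a 5x5 true-black core,
# standing in for the dark region at the drone's center.
frame = np.full((64, 64, 3), 8, dtype=np.uint8)
frame[30:35, 30:35] = 0
count = true_black_count(frame)       # 25

# A fully black frame (like frame 143) saturates the count:
blackout = np.zeros((64, 64, 3), dtype=np.uint8)
full = true_black_count(blackout)     # 64 * 64 = 4096
```

Plotting the per-frame count makes the "spikes near known field events" directly verifiable.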
Increasing luminance reveals hidden color in black pixels orbiting the drone
Reveals suppressed color in the raw video. This phenomenon is closest to the drone, proving the field is strongest near it.
The orbiting pixel region suggests a shear field structure, possibly a gravitational gradient surrounding the drone. The visual orbiting may reflect field-induced plasma or dust trapped in a stable standing wave or light path distortion creating lensing rings that shift color based on depth in the field. This red/blue pattern is consistent with the visible redshift and blueshift in the raw video. Once again, remember that in person there was no visible redshift or blueshift, with the color, shape, and size remaining constant for the entirety of the drone's flight. These "colorized blacks" are not artifacts of the scene; they are sensor-induced responses to field gradients the drone is emitting. Gravitic propulsion produces subtle, coherent distortions in the EM field, especially in CMOS sensor arrays.
Universal luminance flattening and entropy drop as camera enters blueshift zone

Luminance uniformly and significantly falls in sync with divergence and entropy shifts.
Inferno Colormap Legend: The 'inferno' colormap goes from dark purple (representing low intensity) to yellow-white (representing high intensity). The darker areas indicate low magnitude (no significant edge), while brighter areas represent areas with high magnitude (strong edges or distortion).

Color Interpretation:
Dark Purple/Black: Areas with low luminance change or little to no edge. These are the regions in the frame where the content is relatively unchanged (low edge detection).
Yellow/White: Areas with high luminance change, representing strong edges or distortion. These could correspond to motion in the frame, sharp boundaries, or high-contrast transitions.

Entropy measures randomness in pixel intensity. A drop in entropy implies a loss of distinct details, smoother gradients, and less noise, which can happen either due to sensor saturation or due to the field smoothing local light frequencies and angles. This blurs small details across the image in a subtle, nonlinear way. This is visible even before the full blueshift or sensor breakdown (frame 143), meaning the drone's field effects reach outward, altering image quality at a distance. Along with the luminance dropping, this suggests an enveloping field that modifies local radiative behavior, suppressing diversity in detected light.
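Entropy here means Shannon entropy over the pixel-intensity histogram. A minimal version (illustrative, not the exact script used for the charts) shows why a "flattened" frame reads as near-zero entropy while a detailed one sits near the 8-bit maximum:

```python
import numpy as np

def frame_entropy(gray):
    """Shannon entropy (bits) of an 8-bit grayscale frame's histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty histogram bins
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # detailed frame
flat = np.full((64, 64), 40, dtype=np.uint8)            # "flattened" frame
# noisy frame: close to the 8-bit maximum of 8 bits; flat frame: 0 bits
```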
Windowsill light warping
Cannot be caused by any other propulsion system.
Let's revisit this clear example of spacetime field lensing. Once again, the camera remained stationary during this period. The drone stays in the same position, yet the foreground and background light converge toward the drone. As the drone passes overhead, the field pulls on the environment, which is most evident when observing surfaces with reflective or semi-reflective qualities like blinds.
Star overlaying windowsill
Frames 135–138: The light from a distant star overlays and briefly dominates the upward-pulled light from the windowsill.
Light path bending: The drone’s field warps the paths of incoming light, making a background light source (the star) appear in front of a foreground object (the windowsill). This resembles gravitational lensing, where massive objects bend light from background sources. The field may also alter light arrival timing, momentarily intensifying or misplacing photons from the star relative to the windowsill’s light, causing the overlay and subsequent disappearance. This overlay is not explainable by camera error, parallax, or encoding, and aligns with predictions of how gravitic propulsion can distort light trajectories and apparent spatial relationships in a scene.
Light pushing and pulling as drone passes overhead
"Impossible" Light Distortion
For context, the drone was about 10-20 feet above the roof as it passed overhead. On camera, this movement is not captured by the vector field output, which is designed to track anomalous pixel movement. This implies a non-motion-based image transformation due to electromagnetic distortion, not actual motion. This movement, along with the minimal detectable vector movement shown later in this analysis, implies field warping that isn't associated with linear movement, such as a spacetime gradient shear or gravitic lensing. High divergence + high intensity + no vector motion + pixel dragging = localized spacetime distortion. Pixel behavior mimics EM phase distortion, likely a byproduct of field-bending propulsion. The drone manipulates the geometry of light, not just air or inertia. You can also see the light from the wall to the left distort unevenly. Quick question: when have you ever seen a drone do this?
Drone staying the same size on video as it moves closer to the camera

I observed this in person, and it is corroborated through optical flow, pixel drift, and vector field analysis, all of which show motion and distortions in the scene, but not the proportional scaling of the drone expected under normal perspective.
The point at the end of the 200 ft line is the drone's location when the video starts. When the sensor performs a hard reset at the 5.5 s mark, the drone is flying directly over the apartment. I observed this part with my own eyes. As I mentioned in the messages, the drone was much closer than it appears on video. It is common sense that if you are filming an object as it gets closer, it should get bigger on camera. Despite the drone travelling around 200 feet from the start to the end of the video, it never gets bigger. Given the drone's propulsion distorts spacetime, it may be compressing or redirecting light rays around itself (akin to gravitational lensing), causing it to appear a fixed size regardless of distance. The field likely creates a localized warping of geometry, causing light paths from its edges to refract nonlinearly, mimicking a fixed focal length. This observation proves gravitic field influence, as no known inertial propulsion method can maintain a constant scale during forward motion.
Nonlinear Motion Vectors Between the Drone and Star
The star moves independently or oppositely to parallax during the drone’s constant-speed flight.
This is consistent with gravitational lensing. It could also suggest a field interference artifact, with the “star” being a lensing ghost, reflection, or emission caught in field turbulence.
Pixel Drift
Identifies nonlinear drift; massive motion jumps inconsistent with drone velocity or camera mechanics.
Sudden, non-inertial drift patterns suggest local spacetime displacement, where either the drone or its surrounding field temporarily "slips" through the frame without obeying inertial progression. This data cannot be explained by ordinary mechanical drone propulsion or camera shake. The nonlinear and spatially asymmetric distortion patterns, particularly the directional field influence (light moving upward, redshifting), the repeated extreme drift reversals, and the localized disruption only near the drone/star echo, prove the influence of a non-aerodynamic propulsion field. This aligns with theories like inertial dampening or bubble translation, used in Alcubierre warp proposals.
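Drift of a bright feature (the drone or the star) between frames can be re-measured with something as simple as a brightness-weighted centroid. This is an illustrative stand-in for the drift metric, not the original pipeline:

```python
import numpy as np

def bright_centroid(gray):
    """Centroid (row, col) of pixels at least half as bright as the frame max."""
    ys, xs = np.nonzero(gray >= gray.max() * 0.5)
    return ys.mean(), xs.mean()

def drift_px(prev_gray, next_gray):
    """Frame-to-frame displacement of the bright centroid, in pixels."""
    y0, x0 = bright_centroid(prev_gray)
    y1, x1 = bright_centroid(next_gray)
    return float(np.hypot(y1 - y0, x1 - x0))

# Synthetic pair: a bright 3x3 blob jumps 10 pixels to the right.
a = np.zeros((50, 50)); a[20:23, 10:13] = 255
b = np.zeros((50, 50)); b[20:23, 20:23] = 255
d = drift_px(a, b)   # 10.0
```

Plotting `d` across all frame pairs makes the claimed reversals quantifiable: inertial motion yields a smooth curve, while abrupt reversals appear as spikes.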
Optical Flow Vector Field Analysis
Shows vector field distortion with asymmetric, localized vector patterns.
Gravitic propulsion creates localized spacetime curvature, affecting how photons (light) and matter move. Optical flow vectors bending or concentrating near the drone reflects gravitational lensing or spacetime warping as light paths are altered by high-mass-equivalent fields. Vector field irregularities are consistent with non-Newtonian motion and inertial cancellation. The first video detects the directional motion, and the second video detects irregular pixel movement, areas where the motion is different from its surrounding area. In the second video, an anomalous pixel motion halo around the drone is present during the redshift period.
Pixel Divergence

If pixels are moving away from each other in an area ➜ positive divergence (expanding). If they're moving toward each other ➜ negative divergence (contracting).
Divergence measures how much the optical flow vectors radiate out from or converge into a region; in other words, local expansion or compression of pixel motion. Large positive values indicate spreading motion, while negative values indicate collapsing motion. The divergence values support a horizontal field that declines vertically. Divergence maps do not align precisely with the actual star distortion, appearing larger, shifted, and mismatched. That discrepancy is expected if the optical field affecting light is nonlinear or involves localized lensing, refraction, or gravitational-like warping. Pixel divergence shows invisible forces at work; areas around the drone or star are distorting motion even when the visible object doesn't match that disturbance directly. This data strongly supports the presence of a dynamic, fluctuating energy field that influences the environment beyond the visible object and produces nonlinear and oscillating divergence patterns. This also correlates with observed pixel drift and entropy shifts. The data has bursts of strong divergence, followed by suppressed periods, a behavior consistent with gravitational or inertial field modulation, not conventional jet or rotor propulsion. It supports the notion of a non-visual force field altering pixel movement patterns. Or it's evidence of standard computer vision metrics starting to break down in the presence of extreme velocity, relativistic effects, or sensor interference. Either way, it's a gravitic propulsion drone, so does it truly matter? Honestly, how much more proof do you need?
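Divergence is computed from the optical-flow field with finite differences: div = ∂u/∂x + ∂v/∂y. A minimal NumPy sketch on a synthetic radial outflow (the test field is my own, for illustration):

```python
import numpy as np

def flow_divergence(u, v):
    """Divergence (du/dx + dv/dy) of a per-pixel flow field.

    `u` and `v` are (H, W) arrays holding the x and y motion components,
    e.g. the two channels returned by a dense optical-flow pass.
    """
    return np.gradient(u, axis=1) + np.gradient(v, axis=0)

# Radial outflow centred on the grid: vectors point away from the centre,
# so divergence should be positive ("expanding") everywhere.
h = w = 21
ys, xs = np.mgrid[0:h, 0:w]
u = (xs - w // 2).astype(float)
v = (ys - h // 2).astype(float)
div = flow_divergence(u, v)   # du/dx = dv/dy = 1, so div == 2 everywhere
```

Negating `u` and `v` gives the contracting case, with divergence -2 everywhere.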
Pixel Speed Grid Comparison
The drone's grid consistently had the highest pixel speed (change rate) during the redshift phase; during a "phase shift," the "star's" region became dominant; and the speed changes vertically.
While the camera was in the drone's blueshift zone, the top third of the frame consistently had the lowest pixel speed. As the redshift activates, this zone becomes dominant. The box under the drone consistently has the slowest speed. This is analogous to how a blueshift zone slows down time and a redshift zone speeds up time. During the period of the windowsill's light warping, the speed is highest in the top and middle thirds, proving that this effect is strongest near the drone. This suggests a field interaction or energy redistribution, possibly a field effect where the propulsion field transfers momentum or inertia. The drop in the drone's relative motion compared to the "star" may reflect a momentary field shedding or stabilization burst. Also, the pixel speed during the distortion period is so anomalous that Premiere Pro's optical flow algorithm cannot effectively determine the motion. As I mentioned earlier, analyze the video at around 20% speed and apply optical flow to observe this phenomenon.
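The grid comparison reduces to splitting each frame into cells and measuring the mean absolute change per cell. An illustrative sketch, assuming grayscale frames (the 3x3 layout mirrors the 1-9 boxes used here):

```python
import numpy as np

def grid_pixel_speed(prev_gray, next_gray, rows=3, cols=3):
    """Mean absolute frame-to-frame change in each cell of a rows x cols grid."""
    diff = np.abs(next_gray.astype(float) - prev_gray.astype(float))
    h, w = diff.shape
    out = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            out[r, c] = diff[r * h // rows:(r + 1) * h // rows,
                             c * w // cols:(c + 1) * w // cols].mean()
    return out

# Synthetic pair: only the centre cell changes between frames.
a = np.zeros((90, 90), dtype=np.uint8)
b = a.copy()
b[30:60, 30:60] = 90
speeds = grid_pixel_speed(a, b)   # centre cell = 90.0, all others = 0.0
```

Ranking the nine cells of `speeds` per frame pair reproduces the 1-9 ordering used in this section.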
Pixel speed & blueshift/redshift zone correlation

Calculating pixel speed to determine how rapidly the image changes over time in different regions of the video.
Once again, the drone starts rotating towards the camera as I am zooming in from frames 35-57.
The bottom section shows a dramatic increase in pixel speed during this phase (rising from ~6 to over 13). As the drone rotates, its gravitic field begins to align its propulsion axis partially toward the viewer. This intensifies gravitational lensing or compression in the region beneath it (closer to background/ground structures), causing severe local pixel acceleration.

🟩 Middle Section Slight Decline
Middle pixel speed dips slightly, indicating less motion directly through the core, possibly because the axis is tilting upward, exposing more of the field edges than the engine core.

🟦 Top Section Remains Low
The top band remains subdued in motion, consistent with most of the gravitational compression still occurring below the drone.

All of this is consistent with a gravitic propulsion mechanism, manipulating the surrounding spacetime geometry rather than simply moving through it, creating non-intuitive pixel motion, non-linear optical flow, and spectral distortion. The spike at the bottom of the frame before frame 140 correlates with the drone pulling the light from the windowsill toward it.
Sensor Noise
Sensor noise boxes ranked 1-9 through redshift phase
Along with the highest pixel speed, the drone's box consistently shows the highest sensor noise. The drone appears to be creating anisotropic field effects: it pulls background light around itself but visually anchors its own body with minimal displacement in the frame. This was proven when the drone stayed still on screen while the star drifted. This is not inertial motion.
Divergence + Luminance + Black Pixel Overlay

Detects overlapping spikes in divergence (motion spread), luminance drops, and black pixel surges during key anomaly frames.
Divergence spikes indicate sudden expansion or collapse of local optical vectors, suggesting field inflation or deflation zones. Simultaneous luminance loss and black pixel saturation indicate localized light suppression, potentially from high gravitational field density absorbing or redirecting light. This is a key signature of mass-energy field interaction altering the local EM spectrum.
Peak Signal-to-Noise Ratio (PSNR)
This ratio is used as a quality measurement between an original and a compressed image. The higher the PSNR, the better the quality of the compressed or reconstructed image.
There is a clear correlation between PSNR drops and distortion phases.
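Anyone can recompute the per-frame PSNR from the standard definition, 10·log10(MAX²/MSE), with MAX = 255 for 8-bit video. A minimal sketch on synthetic frames:

```python
import numpy as np

def psnr(frame_a, frame_b, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two 8-bit frames."""
    mse = np.mean((frame_a.astype(float) - frame_b.astype(float)) ** 2)
    if mse == 0:
        return float("inf")   # identical frames
    return float(10 * np.log10(max_val ** 2 / mse))

# A uniform error of 4 grey levels per pixel gives MSE = 16:
a = np.full((32, 32), 100, dtype=np.uint8)
b = np.full((32, 32), 104, dtype=np.uint8)
value = psnr(a, b)   # 10 * log10(255**2 / 16) ≈ 36.09 dB
```

The heavier the frame-to-frame transformation, the larger the MSE and the lower the dB value, which is why the 28-30 dB readings below mark the strongest distortion.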
⚠️ Frames 122–124 (Phase Shift 3, Drone Echo/Glitch)
0122 → 0123: 28.06 dB
0123 → 0124: 30.75 dB
Clear sign of heavy visual transformation, exactly where the accordion-like glitch and horizontal stretching occur.

Frames 053–057 (Phase Shift 2 conclusion)
Drop below the ~35 dB range (often around 33–34 dB), confirming the abrupt pixel disruption as the second phase shift finishes.

⚠️ Frame 121 → 122
Sharp PSNR drop to 29.17 dB, marking the beginning of Shift 3 as the drone reorients upward and elongates.
Entropy Mapping
Sharp entropy collapse before and during known phase events (especially frames 50–60, 140–160), dropping to near zero at peak distortions.
Entropy collapse reflects order increase or signal suppression. A gravitic field may dampen sensor entropy by reducing random photon distribution, as field gradients act as information sinks. Entropy troughs are expected in high-curvature zones where local spacetime acts more like a "well" than a random medium. Every major drop or spike in entropy coincides with physical drift, luminance crashes, black pixel saturation, and SSIM flattening. This supports the conclusion of a field-based interference mechanism.
Structural Similarity Index Measure (SSIM)
Tracks frame-to-frame similarity; maps spatial zones of high disruption.
Heatmap Legend

Full Heatmap
Cyan/Green Values
Red/Yellow Values
Reveals consistent low-similarity zones near the drone and moving distortions sweeping from the bottom right to the top left, correlating with the windowsill's light distortion angle and drone proximity. Cyan/green zones pull toward the drone while the redshift is active, while the horizontal area near the drone is primarily red/yellow. This pattern mimics spacetime gradient convergence, where nearby photons and visual data distort as they approach a gravitational field well. The color-matched disruption traveling toward the drone mirrors what you'd expect in gravitational pull patterns on both EM radiation and local objects. The similarity map shows real-world background light bending, consistent with Einstein lensing.
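SSIM is usually computed in sliding windows; the simplified whole-frame version below (my own reduction, using the standard C1/C2 stability constants) is enough to sanity-check frame-to-frame similarity values:

```python
import numpy as np

def global_ssim(x, y, max_val=255.0):
    """Single SSIM value over whole frames (no sliding window)."""
    x, y = x.astype(float), y.astype(float)
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mx, my = x.mean(), y.mean()
    dx, dy = x - mx, y - my
    vx, vy, cov = (dx * dx).mean(), (dy * dy).mean(), (dx * dy).mean()
    return float(((2 * mx * my + c1) * (2 * cov + c2)) /
                 ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

rng = np.random.default_rng(1)
frame = rng.integers(0, 256, (64, 64)).astype(np.uint8)
same = global_ssim(frame, frame)   # identical frames -> 1.0
noisy = np.clip(frame + rng.normal(0, 25, frame.shape), 0, 255).astype(np.uint8)
degraded = global_ssim(frame, noisy)   # drops well below 1.0
```

Low-similarity zones in the heatmaps correspond to regions where a windowed version of this value falls furthest below 1.0.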
Fast Fourier Transform (FFT)

FFT anomalies
Dips in values might represent moments where light is either absorbed, refracted beyond the sensor's view, or "flattened" via gravitational compression. It's analogous to gravitational time dilation altering emission frequency and intensity from a source entering a field. After the sensor blackout, as the scene visually stabilizes, total and mean magnitudes stay lower than pre-blackout levels for some time, implying altered spatial structure or degraded fidelity, possibly due to residual sensor noise or distortion from the drone. This is sensor-level or energy-field related, not natural motion blur or focus shift. The persistent high-band dominance in some areas is not typical for consumer optics.
1. Sudden Drop in HighBand Power (Frames 015–022)
Drop: 2.56B → 1.71B in HighBand
Gravitic Propulsion Reasoning: This 15–22 frame window aligns with the first "phase shift." A gravitic field warping local spacetime may compress or diffuse high-frequency light details. Result: the sensor sees blurred, stretched, or smeared edges, causing loss of high-frequency content, matching the drop in HighBand power.

2. Abrupt Dips in LowBand Values (Frames 050–057)
Drop: 995M → 473M (~50%)
Gravitic Reasoning: Low-band energy is tied to broad luminance gradients and structural layout. This period matches the second major shift where the drone and star distort in sync. This is not explainable by motion blur or lighting alone.

3. Very Low Values in All Bands (Frames 078–089)
The stretching artifact in the drone's image on frame 78 suggests spatial distortion, especially affecting the mid and high frequencies. Immediately following this, all bands drop.
Gravitic Reasoning: This phase occurs after the second sensor collapse and matches an inertial null period where the drone appears to stabilize. A gravitic field might create a low-energy "null bubble", reducing both motion artifacts and environmental reflections temporarily. This would manifest as muted frequency content across all bands: minimal gradients, few sharp edges, lower contrast, despite the drone still being present. These values suggest a nonlinear field geometry where sensor exposure is flattened or masked, consistent with propulsion field stasis or transient nulling.

In a typical FFT analysis, the high band is normally the lowest. In the drone video, the high band, which represents fine details, sharp edges, and noise, is consistently the highest.
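The band split itself can be reproduced by binning the 2-D FFT magnitude by radial frequency. The radius thirds below are an illustrative convention of my own; the original analysis may have used different cutoffs:

```python
import numpy as np

def fft_band_power(gray):
    """Total FFT magnitude in low / mid / high radial-frequency bands."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray.astype(float))))
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(ys - h / 2, xs - w / 2)   # distance from the DC centre
    r_max = r.max()
    low = float(spectrum[r < r_max / 3].sum())
    mid = float(spectrum[(r >= r_max / 3) & (r < 2 * r_max / 3)].sum())
    high = float(spectrum[r >= 2 * r_max / 3].sum())
    return low, mid, high

# A smooth gradient frame keeps almost all of its energy in the low band,
# which is the "typical" ordering described above.
gradient = np.tile(np.linspace(0, 255, 64), (64, 1))
low, mid, high = fft_band_power(gradient)
```

For normal footage `low` dominates; per-frame high-band totals that persistently exceed the other bands would be the anomaly described above.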
Analyze 9 seconds of truth or
accept a lifetime of lies
The truth always carries the ambiguity of the words used to express it. That is why I’m providing you with access to the original video file to analyze everything yourself! Once again, I encourage you to conduct your own research to arrive at your own conclusion.