Long Beach, California, United States
162K followers
500+ connections
About
Articles by Damien
-
Train, Fine-Tune, and Deploy Large Language Models Bootcamp!
By Damien Benveniste
-
Introduction to Machine Learning System Design!
By Damien Benveniste
Contributions
Activity
-
How do you deal with imbalanced data? If you don't have too much data and the imbalance is not too extreme, the typical way to deal with it is to…
Shared by Damien Benveniste
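The post above is truncated, so the recommended technique is not spelled out. One common approach for moderate imbalance on a small dataset (an assumption about where the post was going, not a quote from it) is to resample so the classes are balanced. A minimal, dependency-free sketch of random oversampling of the minority class:

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Randomly duplicate minority-class samples until all classes have as
    many examples as the largest class. A simple illustration; real pipelines
    often prefer class weights or synthetic samples (e.g. SMOTE) instead."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(xs) for xs in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        picked = list(xs)
        while len(picked) < target:          # pad minority classes by resampling
            picked.append(rng.choice(xs))
        out_x.extend(picked)
        out_y.extend([y] * len(picked))
    return out_x, out_y
```

Note that oversampling must be applied only to the training split, never before the train/test split, or the evaluation set leaks duplicated training examples.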
-
With LangChain, it is not difficult to summarize text of any length. To summarize text with an LLM, there are a few strategies. If the whole text…
Shared by Damien Benveniste
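The post above is cut off after naming the strategies. The standard pattern it alludes to (an assumption; LangChain wraps the same idea in its summarization chains) is: if the whole text fits in the model's context, summarize it in one call ("stuff"); otherwise split it into chunks, summarize each, and summarize the summaries ("map-reduce"). A framework-free sketch, where `llm` is a placeholder callable (prompt in, completion out) you would replace with a real client:

```python
def summarize(text, llm, chunk_size=2000, overlap=200):
    """Map-reduce summarization sketch. `llm` is any prompt -> str callable;
    chunk_size/overlap are illustrative character counts, not token counts."""
    if len(text) <= chunk_size:
        # "Stuff" strategy: the whole text fits in a single call.
        return llm(f"Summarize the following text:\n\n{text}")
    chunks, i = [], 0
    while i < len(text):
        # Overlapping chunks so sentences split at a boundary are not lost.
        chunks.append(text[i:i + chunk_size])
        i += chunk_size - overlap
    partials = [llm(f"Summarize the following text:\n\n{c}") for c in chunks]
    return llm("Combine these partial summaries into one summary:\n\n"
               + "\n".join(partials))
```

A "refine" variant would instead thread a running summary through the chunks one at a time, which preserves more ordering at the cost of sequential calls.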
-
Starting August 15th, I am going to teach a new Bootcamp: "Train, Fine-Tune, and Deploy Large Language Models Bootcamp"! This is going to be 6 weeks…
Shared by Damien Benveniste
Experience & Education
Publications
-
Backwards Two-Particle Dispersion in a Turbulent Flow
Phys. Rev. E 89, 041003(R) (2014)
We derive an exact equation governing two-particle backwards mean-squared dispersion for both deterministic and stochastic tracer particles. For the deterministic trajectories, we probe consequences of our formula for short time and arrive at approximate expressions for the mean squared dispersion which involve second order structure functions of the velocity and acceleration fields. For the stochastic trajectories, we analytically compute an exact t^3 contribution to the squared separation of stochastic paths. We argue that this contribution appears also for deterministic paths at long times and present direct numerical simulation (DNS) results for incompressible Navier-Stokes flows to support this claim. We also numerically compute the probability distribution of particle separations for the deterministic paths and the stochastic paths and show their strong self-similar nature.
Other authors
-
Diffusion approximation in turbulent two-particle dispersion
Phys. Rev. E Rapid Communication
We solve an inverse problem for fluid particle pair statistics: we show that a time sequence of probability density functions (PDFs) of separations can be exactly reproduced by solving the diffusion equation with a suitable time-dependent diffusivity. The diffusivity tensor is given by a time integral of a conditional Lagrangian velocity structure function, weighted by a ratio of PDFs. Physical hypotheses for hydrodynamic turbulence (sweeping, short memory, mean-field) yield simpler integral formulas, including one of Kraichnan and Lundgren (K-L). We evaluate the latter using a space-time database from a numerical Navier-Stokes solution for driven turbulence. The K-L formula reproduces PDFs well at root-mean-square separations, but the growth rate of mean-square dispersion is overpredicted due to neglect of memory effects. More general applications of our approach are sketched.
Other authors
-
Suppression of particle dispersion by sweeping effects in synthetic turbulence
Phys. Rev. E
Synthetic models of Eulerian turbulence like so-called kinematic simulations (KS) are often used as computational shortcuts for studying Lagrangian properties of turbulence. These models have been criticized by Thomson and Devenish (2005), who argued on physical grounds that sweeping decorrelation effects suppress pair dispersion in such models. We derive analytical results for Eulerian turbulence modeled by Gaussian random fields, in particular for the case with zero mean velocity. Our starting point is an exact integrodifferential equation for the particle pair separation distribution obtained from the Gaussian integration-by-parts identity. When memory times of particle locations are short, a Markovian approximation leads to a Richardson-type diffusion model. We obtain a time-dependent pair diffusivity tensor of the form K_ij(r,t) = S_ij(r) τ(r,t), where S_ij(r) is the structure-function tensor and τ(r,t) is an effective correlation time of velocity increments. Crucially, this is found to be the minimum value of three times: the intrinsic turnover time τ_eddy(r) at separation r, the overall evolution time t, and the sweeping time r/v_0, with v_0 the rms velocity. We study the diffusion model numerically by a Monte Carlo method. With inertial ranges like the largest achieved in most current KS (about 6 decades long), our model is found to reproduce the t^{9/2} power law for pair dispersion predicted by Thomson and Devenish and observed in the KS. However, for much longer ranges, our model exhibits three distinct pair-dispersion laws in the inertial range: a Batchelor t^2 regime, followed by a Kraichnan-model-like t^1 diffusive regime, and then a t^6 regime. Finally, outside the inertial range, there is another t^1 regime with particles undergoing independent Taylor diffusion. These scalings are exactly the same as those predicted by Thomson and Devenish for KS with large mean velocities, which we argue hold also for KS with zero mean velocity.
Other authors
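The diffusivity model quoted in the abstract above, set in standard notation (restating only what the abstract says, nothing added):

```latex
K_{ij}(r,t) = S_{ij}(r)\,\tau(r,t),
\qquad
\tau(r,t) = \min\!\big(\tau_{\mathrm{eddy}}(r),\; t,\; r/v_0\big)
```

where $S_{ij}(r)$ is the velocity structure-function tensor, $\tau_{\mathrm{eddy}}(r)$ the eddy turnover time at separation $r$, $t$ the evolution time, and $v_0$ the rms velocity.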
Patents
-
Systems And Methods For Improving The Interpretability And Transparency Of Machine Learning Models
Issued US 16/137200
Embodiments herein provide for a machine learning algorithm that generates models that are more interpretable and transparent than existing machine learning approaches. These embodiments identify, at a record level, the effect of individual input variables on the machine learning model. To provide those improvements, a reason code generator assigns monotonic relationships to a series of input variables, which are then incorporated into the machine learning algorithm as metadata. In some embodiments, the reason code generator creates records based on the monotonic relationships, which are used by the machine learning algorithm to generate predicted values. The reason code generator compares an original predicted value from the machine learning model to the predicted values from the machine learning model.
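The final comparison step described in the abstract (score the original record, score records perturbed per variable, compare) can be sketched loosely as follows. This is an illustrative perturb-and-compare reason-code sketch, not the patented method; `model`, `record`, and `baselines` are hypothetical names, and the patent's monotonicity metadata is not modeled here.

```python
def reason_codes(model, record, baselines):
    """Rank input variables by how much the model's prediction drops when
    each variable is replaced with a baseline value. `model` is a callable
    mapping a dict of features to a numeric score."""
    original = model(record)
    deltas = {}
    for name, baseline in baselines.items():
        perturbed = dict(record)
        perturbed[name] = baseline       # neutralize one variable at a time
        deltas[name] = original - model(perturbed)
    # Largest delta first: the variable contributing most to the score.
    return sorted(deltas, key=deltas.get, reverse=True)
```

With a toy linear model `lambda r: 2 * r["a"] + r["b"]` and zero baselines, the record `{"a": 3, "b": 1}` yields the ordering `["a", "b"]`, since zeroing `a` moves the score by 6 versus 1 for `b`.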
Projects
Languages
-
English
Native or bilingual proficiency
-
French
Native or bilingual proficiency
-
Spanish
Elementary proficiency
Organizations
-
JHU Economics and Finance Club
President
-
The JHU Quant Trading Group
President
-
1st Trading Competition at Johns Hopkins University
Co-chair
-
Princeton – UChicago Quant Trading Conference 2013
Member of the organizing committee
-
JHU Brazilian Jiu-jitsu club
Instructor
-
Recommendations received
1 person has recommended Damien