Dawn and fall of non-Gaussianity in the quantum parametric oscillator

Systems of coupled optical parametric oscillators (OPOs) forming an Ising machine are emerging as large-scale simulators of the Ising model. Advances in computer science and nonlinear optics have triggered not only the physical realization of hybrid (electro-optical) or all-optical Ising machines, but also the demonstration of quantum-inspired algorithms boosting their performance. To date, the use of the quantum nature of parametrically generated light as a further resource for computation remains a major open issue. A key quantum feature is the non-Gaussian character of the system state across the oscillation threshold. In this paper, we perform an extensive analysis of the emergence of non-Gaussianity in a single quantum OPO with an applied external field. We model the OPO by a Lindblad master equation, which we solve numerically by an ab initio method based on exact diagonalization. Non-Gaussianity is quantified by three different metrics: the Hilbert-Schmidt distance, the quantum relative entropy, and the photon distribution. Our findings reveal a nontrivial interplay between parametric drive and applied field: (i) increasing the pump monotonically enhances non-Gaussianity, and (ii) increasing the field first sharpens non-Gaussianity and then restores the Gaussian character of the state above a threshold value.

https://arxiv.org/abs/2312.16530

Exponential improvement in combinatorial optimization by hyperspins

Classical and quantum physical systems can simulate the Ising Hamiltonian for large-scale optimization and machine learning. However, devices such as quantum annealers and coherent Ising machines suffer an exponential drop in the probability of success in finite-size scaling. We show that, by exploiting a high-dimensional embedding of the Ising Hamiltonian and subsequent dimensional annealing, this drop is counteracted by an exponential improvement in performance. Our analysis relies on extensive statistics of the convergence dynamics obtained by high-performance computing. We propose a realistic experimental implementation of the new annealing device using off-the-shelf coherent Ising machine technology. The hyperscaling heuristics can also be applied to other quantum or classical Ising machines by engineering nonlinear gain, loss, and non-local couplings.

Hyperscaling in the coherent hyperspin machine

https://arxiv.org/abs/2308.02329

Biosensing with free space whispering gallery mode microlasers

Highly accurate biosensors for few- or single-molecule detection play a central role in numerous key fields, such as healthcare and environmental monitoring. In the last decade, laser biosensors have been investigated as proofs of concept, and several technologies have been proposed. Here we demonstrate polymeric whispering gallery microlasers as biosensors for detecting amounts of protein as small as 400 pg. They have the advantage of working in free space, without any need for waveguiding of the input excitation or the output signal. The photonic microsensors can be easily patterned on microscope slides and operate in air and in solution. We estimate a sensitivity of up to 148 nm/RIU for three different protein dispersions. In addition, the sensing ability of passive spherical resonators in the presence of dielectric nanoparticles that mimic proteins is described by massive ab initio numerical simulations.

https://doi.org/10.1364/PRJ.477139

3D+1 Quantum Nonlocal Solitons with Gravitational Interaction

https://arxiv.org/abs/2202.10741

Nonlocal quantum fluids emerge as dark-matter models and as tools for quantum simulations and technologies. However, strongly nonlinear regimes, like those involving multi-dimensional self-localized solitary waves (nonlocal solitons), are only marginally explored as far as quantum features are concerned. We study the dynamics of 3D+1 solitons in the second-quantized nonlocal nonlinear Schroedinger equation. We theoretically investigate the quantum diffusion of the soliton center of mass and of other soliton parameters, varying the interaction length. 3D+1 simulations of the Ito partial differential equations arising from the positive P-representation of the density matrix validate the theoretical analysis. The numerical results unveil the onset of non-Gaussian statistics of the soliton, which may signal quantum-gravitational effects and be a resource for quantum computing. The non-Gaussianity arises from the interplay of the quantum diffusion of the soliton parameters and the stable invariant propagation. The fluctuations and the non-Gaussianity are universal effects expected for any nonlocality and dimensionality.

To Python or not to Python, to C++ or not to C++ (with MATLAB, MPI, CUDA, and FORTRAN)

Programming in C is the best for scientific computing.

You certainly disagree; tools like MATLAB increase productivity.

I remember when I started working as a researcher, doing a lot of computing. I was dealing with professors used to dinosaurs like FORTRAN or C. MATLAB was not used much; it was more than 20 years ago!

But MATLAB is a very professional tool: it works like a charm, and many scientific papers are produced with MATLAB.

Nowadays, however, most students love and want to use Python, or more fashionable things like Julia or R. Python is a beauty, and it is a must for machine learning. Python is free (though many universities give their students access to MATLAB). But I do not find it very professional. You continuously need to tweak the code, install a missing package, or, worst of all, fight with filesystem permissions and access, because it is an interpreted language. MATLAB is also interpreted, but its ecosystem is stable and well integrated in operating systems like Windows, OSX, or Linux. Of the more than 200 papers I have written, only one so far used Python code.

At the end of the day, for professional figures, I use MATLAB (with some help from Illustrator, PowerPoint, GIMP, etc.). Many codes in my papers are written in MATLAB, as in the recent work on neuromorphic computing with waves. Also, the deep learning toolbox of MATLAB is valuable.

I have written several papers on parallel computing, mainly using the MPI protocol. In the beginning, for MPI, I used FORTRAN, but later (nearly 15 years ago) I switched to C++. I am still writing MPI codes in C++, and I am practicing CUDA. You can use CUDA from Python, but you only really understand CUDA by writing it in C++.

But as I get into the details of a code (and as I age), improving or optimizing it, I realize that I am progressively switching back to C. Just pure C (sic!). The reason is that at a lower programming level I have better control of what the code does, and I can better understand the side effects of a given routine. In C, dealing with complex variables or arrays is clearer to me, despite being much more cumbersome (and using pointers makes you feel a lot smarter!).

As a side effect, the code is simpler to read and understand, though much less cool and modern. Even so, I have to admit that maintaining a code in C++ is still more efficient for me than in FORTRAN or C. Notably, my last FORTRAN paper is dated 2017!

I am not a boomer, so you cannot say “ok boomer”, but I think that this Python-mania is not the best for scientific computing, and not the best for students. It is not only an issue of speed (and obviously C is the fastest, with FORTRAN as a good competitor). It is also a matter of how to learn to write programs for scientific computing from scratch. For me, the best way to learn and practice scientific computing is still a wise combination of C and C++, with MATLAB for visualization! Python (and TensorFlow and all of that) gives its best in machine learning.