Investigating Bell’s Theorem on Correlated Systems Utilizing Monte Carlo Methods

Presenter Information

Ryan Corbin

Document Type

Oral Presentation

Campus where you would like to present

SURC 140

Start Date

May 16, 2013

End Date

May 16, 2013

Abstract

Bell’s inequality is an expression that can be used to test for the existence of hidden variables in a correlated system. The idea of hidden variables was introduced as a potential explanation for the non-deterministic outcomes of quantum mechanical measurements. The existence of hidden variables can be tested experimentally. First, each measurement of one component of a correlated system must be assigned a value of +1 or −1, with an equal probability of obtaining either value. Ensemble averages of these measurements are then computed and substituted into Bell’s inequality. This theorem was essential in settling the EPR argument against quantum mechanical indeterminism: violation of the inequality shows experimentally that there are no local hidden variables in a quantum system. For a system exhibiting classical behavior, Bell’s inequality should not be violated, provided the correlated system meets the criteria underlying the assumptions of Bell’s theorem. A Mathematica model of Bell’s inequality was developed using a Monte Carlo approach that randomizes parameters in order to test models of correlated systems. The model agrees with classical determinism to within one standard deviation. Both classical and quantum mechanical systems can be tested if properly correlated, and it is of future interest to test whether this extends to chaotic systems.
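The procedure described above can be sketched in code. The original model was written in Mathematica, which is not reproduced here; the following is a minimal Python sketch under two stated assumptions: it uses the CHSH form of Bell's inequality (the abstract does not specify which form was used), and it takes as its example a simple deterministic local hidden-variable model in which the hidden variable is a random angle and each detector returns +1 or −1. For any such local model, the ensemble-averaged CHSH combination must satisfy |S| ≤ 2, which is what the Monte Carlo run checks.

```python
import math
import random

def classical_chsh(n_samples=100_000, seed=0):
    """Monte Carlo estimate of the CHSH combination S for a deterministic
    local hidden-variable model (a hypothetical example, not the model
    from the presentation).  The hidden variable lam is a random angle;
    each detector assigns +1 or -1 via the sign of cos(setting - lam)."""
    rng = random.Random(seed)

    # Detector settings in radians.  These angles maximize the quantum
    # mechanical violation, but a local model must still give |S| <= 2.
    a, a2 = 0.0, math.pi / 2
    b, b2 = math.pi / 4, 3 * math.pi / 4

    def A(setting, lam):
        # Measurement on particle 1: dichotomic +1 / -1 outcome.
        return 1 if math.cos(setting - lam) >= 0 else -1

    def B(setting, lam):
        # Measurement on particle 2: perfectly anti-correlated with A.
        return -A(setting, lam)

    # Ensemble averages E(x, y) over a shared set of hidden variables.
    sums = {"ab": 0, "ab2": 0, "a2b": 0, "a2b2": 0}
    for _ in range(n_samples):
        lam = rng.uniform(0.0, 2 * math.pi)
        sums["ab"]   += A(a,  lam) * B(b,  lam)
        sums["ab2"]  += A(a,  lam) * B(b2, lam)
        sums["a2b"]  += A(a2, lam) * B(b,  lam)
        sums["a2b2"] += A(a2, lam) * B(b2, lam)
    E = {key: total / n_samples for key, total in sums.items()}

    # CHSH combination: S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    return E["ab"] - E["ab2"] + E["a2b"] + E["a2b2"]

S = classical_chsh()
print(abs(S))  # a local hidden-variable model always obeys |S| <= 2
```

Because the same hidden variable is used in all four correlators of each sample, every sample contributes exactly ±2 to the CHSH sum, so the average can never exceed 2 in magnitude; a quantum mechanically correlated pair, by contrast, can reach |S| = 2√2 at these settings, which is the violation the abstract refers to.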

Faculty Mentor(s)

Michael Braunstein

Additional Mentoring Department

Physics


May 16th, 1:50 PM May 16th, 2:10 PM
