Institute for Theoretical Physics II, University of Erlangen-Nuremberg<br />
2017 Machine Learning for Physicists, by Florian Marquardt (page last edited by ThomasFoesel, 2017-05-13)<br />
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Mailing list''': If you are a regular student, please join the [https://www.studon.fau.de/studon/ilias.php?ref_id=1877317&cmdClass=ilcourseregistrationgui&cmd=show&cmdNode=r4:h3:72&baseClass=ilRepositoryGUI studon course "Machine Learning for Physicists 2017"]. If you are a PhD student (without a studon account), please send an email to [mailto:marquardt-office@mpl.mpg.de marquardt-office@mpl.mpg.de] (Gesine Murphy), with the subject line "MACHINE LEARNING". Then you will be added to a mailing list. <br />
* '''Time/place''': Monday 18:00-20:00 and Thursday 18:00-20:00. Two slots per week are reserved because I will be traveling quite a bit during the summer term, so in some weeks there will be no lecture, whereas in others there will be two (i.e., on average, one lecture per week). Such a late time slot should not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: some students may also wish to attend the lecture on complex systems by Claus Metzner, which shares some themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Second lecture''': Thursday, May 11, 18:00, lecture hall D (!)<br />
* '''Further lecture times''': See the timetable below. We are still working out the location: we try to reserve the largest lecture hall available, and will announce the room here.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks were introduced as early as the 1950s, they have really taken off only in the past decade, with spectacular successes in many areas. Their performance now often surpasses that of humans, as demonstrated by recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are also being considered more and more for applications in physics, ranging from predicting material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': You will only need matrix multiplication and the chain rule, i.e. the course will be accessible to bachelor, master and graduate students alike. However, knowledge of any programming language will make it much more fun. We will sometimes present examples in the Python programming language, a modern interpreted language with powerful linear algebra and plotting facilities.<br />
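Since the course builds on just these two ingredients, here is a sketch (our own illustration, not course material) of how they combine in a neural network: the forward pass is matrix multiplication, and the gradient is the chain rule applied layer by layer, checked here against a finite-difference estimate with NumPy:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    # Forward pass of a tiny two-layer network: matrix multiplications
    # with a sigmoid nonlinearity after each one.
    return sigmoid(W2 @ sigmoid(W1 @ x))

def grad_x(x, W1, W2):
    # Backward pass: the chain rule, layer by layer:
    # d out / d x = s'(a2) * W2 diag(s'(a1)) W1.
    h = sigmoid(W1 @ x)
    out = sigmoid(W2 @ h)
    return ((out * (1 - out)) * (W2 * (h * (1 - h))) @ W1).ravel()

rng = np.random.RandomState(0)
W1 = rng.randn(4, 3)   # input (3) -> hidden (4) weights
W2 = rng.randn(1, 4)   # hidden (4) -> output (1) weights
x = rng.randn(3)

# Compare the chain-rule gradient with a central finite difference.
eps = 1e-6
numeric = np.array([(forward(x + eps * np.eye(3)[i], W1, W2)
                     - forward(x - eps * np.eye(3)[i], W1, W2)) / (2 * eps)
                    for i in range(3)]).ravel()
print(np.max(np.abs(grad_x(x, W1, W2) - numeric)))  # tiny: the two agree
```

This is exactly the computation that backpropagation (covered in the first lectures) organizes efficiently for networks with many layers.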
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python modules. We '''highly recommend using the 3.x branch''' (cf. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows you to create and share documents containing live (Python) code, equations, visualizations and explanatory text. Jupyter thus serves a similar purpose to Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
=== Files ===<br />
<br />
* [[Media:MachineLearning2017_1.pdf | PDF Slides Lecture 1 (8.5.2017)]]<br />
* The Python source code will appear here once we find out how to upload it to the wiki (which currently refuses code files...)<br />
* [[Media:MachineLearning2017_2_v2.pdf | PDF Slides Lecture 2 v2 (11.5.2017)]]<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
The following instructions should be quite detailed and easy to follow. If you nevertheless encounter a problem that you cannot solve on your own, please write an email to [mailto:thomas.foesel@fau.de Thomas Foesel].<br />
<br />
Note: the monospaced text in this section shows commands that have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download the installation script for the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row). In the terminal, change into the directory of the downloaded file (<code>$ cd ...</code>) and run <code>$ bash Miniconda3-latest-MacOSX-x86_64.sh</code> (no root privileges needed, since Miniconda installs into your home directory).<br />
*# The installer on the website may not be the newest Conda version, so update it via <code>$ conda update conda</code>.<br />
*# Create a Conda environment with <br/><code>$ conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>$ source activate neuralnets</code>.<br />
*# <code>$ conda install numpy scipy mkl nose sphinx theano pygpu yaml hdf5 h5py jupyter matplotlib</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: protecting Jupyter ====<br />
<br />
<span style="color:red">'''Important:'''</span> If you intend to run Jupyter on a multi-user system (like the CIP pool), it is '''absolutely necessary''' to protect it against arbitrary code execution by other users. The instructions can be found [http://testnb.readthedocs.io/en/stable/examples/Notebook/Configuring%20the%20Notebook%20and%20Server.html here].<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail because it tries to load TensorFlow, but that is OK: its only purpose is to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# Edit the file ".keras/keras.json" in your home directory, replacing "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open "~/.keras/keras.json" with your preferred text editor (a command-line editor like <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> or <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
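The edit in step 2 can also be scripted instead of done by hand. The following is a sketch using only the Python standard library; <code>set_keras_backend</code> is our own hypothetical helper, not part of Keras, and it is demonstrated here on a temporary copy of the config file:

```python
import json
import os
import tempfile

def set_keras_backend(config_path, backend="theano"):
    # Load the Keras config file, switch the backend entry, write it back.
    with open(config_path) as f:
        cfg = json.load(f)
    cfg["backend"] = backend
    with open(config_path, "w") as f:
        json.dump(cfg, f, indent=4)
    return cfg

# Demonstrated on a temporary copy; for the real file, point it at
# os.path.expanduser("~/.keras/keras.json") instead.
demo_path = os.path.join(tempfile.mkdtemp(), "keras.json")
with open(demo_path, "w") as f:
    json.dump({"backend": "tensorflow", "floatx": "float32"}, f)
print(set_keras_backend(demo_path)["backend"])  # theano
```

To modify the actual config, call the helper with <code>os.path.expanduser("~/.keras/keras.json")</code> after Keras has created that file.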
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you:<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
To check this, download the scripts, rename the file extension from ".txt" to ".py", and execute them:<br />
* on Linux: <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code><br />
* on Mac (with Miniconda):<br />
*# <code>$ source activate neuralnets</code> (has to be done once in each new shell session)<br />
*# <code>$ python <script.py></code>, e.g. <code>$ python theano_minimal.py</code><br />
* on Windows (with Miniconda):<br />
*# <code>activate neuralnets</code> (has to be done once in each new shell session)<br />
*# <code>python.exe <script.py></code>, e.g. <code>python.exe theano_minimal.py</code> (in the Conda shell, <code>python <script.py></code> should also work)<br />
<br />
In the same way, you should also be able to execute your own Python scripts. If you call <code>$ python3</code>/<code>$ python</code>/<code>python.exe</code> without an argument, an interactive session is started, i.e. you can enter Python commands directly into the terminal.<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code> (this will automatically open a browser tab in which you can work).<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Mailing list''': If you are a regular student, please join the [https://www.studon.fau.de/studon/ilias.php?ref_id=1877317&cmdClass=ilcourseregistrationgui&cmd=show&cmdNode=r4:h3:72&baseClass=ilRepositoryGUI studon course "Machine Learning for Physicists 2017"]. If you are a PhD student (without a studon account), please send an email to [mailto:marquardt-office@mpl.mpg.de marquardt-office@mpl.mpg.de] (Gesine Murphy), with the subject line "MACHINE LEARNING". Then you will be added to a mailing list. <br />
* '''Time/place''': Monday 18:00-20:00 and Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Second lecture''': Thursday, May 11, 18:00, lecture hall D (!)<br />
* '''Further lecture times''': See time table below. We are still figuring out the place: we try to reserve the largest lecture hall available. You will find the place here.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 50s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows to create and share documents that contain live (Python) code, equations, visualizations and explanatory text. So, Jupyter serves a similar purpose like Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
=== Files ===<br />
<br />
* [[Media:MachineLearning2017_1.pdf | PDF Slides Lecture 1 (8.5.2017)]]<br />
* Here you would find the python source code, once we find out how to upload that into the wiki (which refuses code...)<br />
* [[Media:MachineLearning2017_2_v2.pdf | PDF Slides Lecture 2 v2 (11.5.2017)]]<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
The following instructions should be quite detailed and easy to follow. If you nevertheless encounter a problem which you cannot solve for yourself, please write an email to [mailto:thomas.foesel@fau.de Thomas Foesel].<br />
<br />
Note: the monospaced text in this section are commands which have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download the installation script for the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row). In the terminal, go into the directory of this file (<code>$ cd ...</code>) and run <code># bash Miniconda3-latest-MacOSX-x86_64.sh</code>.<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/><code>$ conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>$ source activate neuralnets</code>.<br />
*# <code>$ conda install numpy scipy mkl nose sphinx theano pygpu yaml hdf5 h5py jupyter matplotlib</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: protecting Jupyter ====<br />
<br />
<span style="color:red">'''Important:'''</span> If you intend to run Jupyter on a multi-user system (like the CIP pool), it is '''absolutely necessary''' to protect it against arbitrary code execution by other users. The instructions can be found [http://testnb.readthedocs.io/en/stable/examples/Notebook/Configuring%20the%20Notebook%20and%20Server.html here].<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail as it tries to load TensorFlow, but this is OK. Its purpose is to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open file "~/.keras/keras.json" in your home directory with your preferred text editor (either with command line editors like <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> and <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you :<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
To check this, download the scripts, rename the file extension from ".txt" to ".py", and execute them<br />
* on Linux: <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code><br />
* on Mac (with Miniconda):<br />
*# <code>$ source activate neuralnets</code> (has to be done once in each new shell session)<br />
*# <code>$ python <script.py></code>, e.g. <code>$ python theano_minimal.py</code><br />
* on Mac (with Miniconda):<br />
*# <code>activate neuralnets</code> (has to be done once in each new shell session)<br />
*# <code>python.exe <script.py></code>, e.g. <code>$ python.exe theano_minimal.py</code> (in the Conda shell, also <code>python <script.py></code> should work)<br />
<br />
In the same way, you should also be able to execute your own Python scripts. If you call <code>$ python3</code>/<code>$ python</code>/<code>python.exe</code> without an argument, an interactive session is started, i.e. you can directly enter Python commands into the terminal.<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code> (will automatically open a browser tab where you can work).<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=53142017 Machine Learning for Physicists, by Florian Marquardt2017-05-12T11:50:24Z<p>ThomasFoesel: fixed one "bug"</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Mailing list''': If you are a regular student, please join the [https://www.studon.fau.de/studon/ilias.php?ref_id=1877317&cmdClass=ilcourseregistrationgui&cmd=show&cmdNode=r4:h3:72&baseClass=ilRepositoryGUI studon course "Machine Learning for Physicists 2017"]. If you are a PhD student (without a studon account), please send an email to [mailto:marquardt-office@mpl.mpg.de marquardt-office@mpl.mpg.de] (Gesine Murphy), with the subject line "MACHINE LEARNING". Then you will be added to a mailing list. <br />
* '''Time/place''': Monday 18:00-20:00 and Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Second lecture''': Thursday, May 11, 18:00, lecture hall D (!)<br />
* '''Further lecture times''': See time table below. We are still figuring out the place: we try to reserve the largest lecture hall available. You will find the place here.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 50s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows you to create and share documents containing live (Python) code, equations, visualizations and explanatory text. Jupyter thus serves a similar purpose to Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
=== Files ===<br />
<br />
* [[Media:MachineLearning2017_1.pdf | PDF Slides Lecture 1 (8.5.2017)]]<br />
* The python source code will appear here once we find out how to upload it into the wiki (which refuses code files for now...)<br />
* [[Media:MachineLearning2017_2_v2.pdf | PDF Slides Lecture 2 v2 (11.5.2017)]]<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
The following instructions should be quite detailed and easy to follow. If you nevertheless encounter a problem that you cannot solve yourself, please write an email to [mailto:thomas.foesel@fau.de Thomas Foesel].<br />
<br />
Note: the monospaced text in this section shows commands that have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download the installation script for the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row). In the terminal, go into the directory of this file (<code>$ cd ...</code>) and run <code>$ bash Miniconda3-latest-MacOSX-x86_64.sh</code>.<br />
*# The installer from the website may not be the latest Conda version, so update it via <code>$ conda update conda</code>.<br />
*# Create a Conda environment with <br/><code>$ conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>$ source activate neuralnets</code>.<br />
*# <code>$ conda install numpy scipy mkl nose sphinx theano pygpu yaml hdf5 h5py jupyter matplotlib</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# The installer from the website may not be the latest Conda version, so update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: protecting Jupyter ====<br />
<br />
<span style="color:red">'''Important:'''</span> If you intend to run Jupyter on a multi-user system (like the CIP pool), it is '''absolutely necessary''' to protect it against arbitrary code execution by other users. The instructions can be found [http://testnb.readthedocs.io/en/stable/examples/Notebook/Configuring%20the%20Notebook%20and%20Server.html here].<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail because it tries to load TensorFlow; that is OK, since its only purpose is to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# Edit the file ".keras/keras.json" in your home directory and replace "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open "~/.keras/keras.json" with your preferred text editor (a command-line editor such as <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> or <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
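After the edit, ".keras/keras.json" should look roughly as follows (the exact set of keys depends on your Keras version; the entry that matters here is "backend"):<br />

```json
{
    "epsilon": 1e-07,
    "floatx": "float32",
    "backend": "theano"
}
```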
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows to create and share documents that contain live (Python) code, equations, visualizations and explanatory text. So, Jupyter serves a similar purpose like Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
The following instructions should be quite detailed and easy to follow. If you nevertheless encounter a problem which you cannot solve for yourself, please write an email to [mailto:thomas.foesel@fau.de Thomas Foesel].<br />
<br />
Note: the monospaced text in this section are commands which have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download the installation script for the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row). In the terminal, go into the directory of this file (<code>$ cd ...</code>) and run <code># bash Miniconda3-latest-MacOSX-x86_64.sh</code>.<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/><code>$ conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>$ activate neuralnets</code>.<br />
*# <code>$ conda install numpy scipy mkl nose sphinx theano pygpu yaml hdf5 h5py jupyter matplotlib</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: protecting Jupyter ====<br />
<br />
<span style="color:red">'''Important:'''</span> If you intend to run Jupyter on a multi-user system (like the CIP pool), it is '''absolutely necessary''' to protect it against arbitrary code execution by other users. The instructions can be found [http://testnb.readthedocs.io/en/stable/examples/Notebook/Configuring%20the%20Notebook%20and%20Server.html here].<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail as it tries to load TensorFlow, but this is OK. Its purpose is to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open file "~/.keras/keras.json" in your home directory with your preferred text editor (either with command line editors like <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> and <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=52642017 Machine Learning for Physicists, by Florian Marquardt2017-05-08T14:27:03Z<p>ThomasFoesel: Miniconda for Mac</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows to create and share documents that contain live (Python) code, equations, visualizations and explanatory text. So, Jupyter serves a similar purpose like Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
Note: the monospaced text in this section are commands which have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download the installation script for the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row). In the terminal, go into the directory of this file (<code>$ cd ...</code>) and run <code># bash Miniconda3-latest-MacOSX-x86_64.sh</code>.<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/><code>$ conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>$ activate neuralnets</code>.<br />
*# <code>$ conda install numpy scipy mkl nose sphinx theano pygpu yaml hdf5 h5py jupyter matplotlib</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Because there are more recent Conda versions than on the website, update it via <code>conda update conda</code>.<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail as it tries to load TensorFlow, but this is OK. Its purpose is to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open file "~/.keras/keras.json" in your home directory with your preferred text editor (either with command line editors like <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> and <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=52632017 Machine Learning for Physicists, by Florian Marquardt2017-05-08T14:16:28Z<p>ThomasFoesel: rewritten the part about Keras with Theano backend</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': The only mathematical prerequisites are matrix multiplication and the chain rule, so the course will be understandable to bachelor's, master's, and graduate students alike. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the Python programming language, a modern interpreted language with powerful linear algebra and plotting facilities.<br />
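To illustrate the two mathematical prerequisites in Python, here is a short sketch using NumPy (the numbers and functions are arbitrary choices for illustration): a matrix applied to a vector, and a numerical check of the chain rule for a composed function.<br />

```python
import numpy as np

# Matrix multiplication: a 2x3 matrix applied to a 3-vector.
W = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, 0.0, -1.0])
y = W @ x  # shape (2,)

# Chain rule: for f(g(x)) with f = sin and g(x) = x**2,
# d/dx sin(x**2) = cos(x**2) * 2x.
x0 = 0.7
analytic = np.cos(x0**2) * 2 * x0
eps = 1e-6
numeric = (np.sin((x0 + eps)**2) - np.sin((x0 - eps)**2)) / (2 * eps)

print(y)                               # [-2. -2.]
print(abs(analytic - numeric) < 1e-6)  # True
```

This is exactly the kind of bookkeeping (matrix products in the forward pass, the chain rule in backpropagation) that the course builds on.<br />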
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python modules. We '''highly recommend using the 3.x branch''' (cf. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows you to create and share documents containing live (Python) code, equations, visualizations and explanatory text. Jupyter thus serves a similar purpose to Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
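To give a flavor of how compact such code can be, here is a plain-NumPy sketch of a forward pass through a small two-layer network with randomly initialized weights (no Theano or Keras required; the layer sizes and the sigmoid nonlinearity are arbitrary choices for illustration — frameworks like Keras make the equivalent code even shorter and add training):<br />

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, W, b):
    """One fully connected layer followed by a sigmoid nonlinearity."""
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

# Random weights for a 4 -> 8 -> 2 network (illustrative sizes).
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)

x = rng.normal(size=4)               # input vector
y = dense(dense(x, W1, b1), W2, b2)  # network output, shape (2,)
print(y.shape)  # (2,)
```

Training such a network then amounts to computing gradients of a cost function with respect to W1, b1, W2, b2 via the chain rule, which is what Theano automates.<br />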
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
Note: the monospaced text in this section consists of commands which have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. A "#" at the start of a line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if it is configured suitably), whereas commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal, which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There may be alternative ways to do so; if you prefer another approach that works for you, that is of course also fine.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download and install the [https://www.continuum.io/downloads#macos Anaconda collection] (make sure to select Python 3.x, the green button). As this takes several GB of space on your hard drive (including many packages that you might not need), you can instead download the smaller [https://conda.io/miniconda.html Miniconda] package and follow the instructions given for Windows below (we have not tested this on a Mac, so please let us know whether it works).<br />
*# <code>$ conda install jupyter Theano</code><br />
*# <code>$ pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
# Load Keras into Python (this command will probably fail as it tries to load TensorFlow, but that is OK; its purpose is only to initialize the ".keras" folder):<br />
#* on Linux: <code>$ python3 -c "import keras"</code><br />
#* on Mac: <code>$ source activate neuralnets; python -c "import keras"</code><br />
#* on Windows:<br/><code>activate neuralnets</code><br/><code>python -c "import keras"</code><br />
# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano". To do that,<br />
#* on Linux/Mac: open file "~/.keras/keras.json" in your home directory with your preferred text editor (either with command line editors like <code>$ vi ~/.keras/keras.json</code>, <code>$ emacs ~/.keras/keras.json</code> and <code>$ nano ~/.keras/keras.json</code>, or any graphical text editor)<br />
#* on Windows:<br/><code>cd %USERPROFILE%</code><br><code>notepad .keras/keras.json</code><br />
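After the edit, the file should look roughly like this (the exact set of keys and their values may differ between Keras versions; the essential entry is the "backend" field):<br />

```json
{
    "image_dim_ordering": "th",
    "epsilon": 1e-07,
    "floatx": "float32",
    "backend": "theano"
}
```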
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
[[media:keras_minimal.txt|Minimal example for Keras]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>
<hr />
<div></div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=File:Keras_minimal.txt&diff=5261File:Keras minimal.txt2017-05-08T13:25:56Z<p>ThomasFoesel: ThomasFoesel uploaded a new version of File:Keras minimal.txt</p>
<hr />
<div></div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=File:Keras_minimal.txt&diff=5260File:Keras minimal.txt2017-05-08T13:00:27Z<p>ThomasFoesel: </p>
<hr />
<div></div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=52072017 Machine Learning for Physicists, by Florian Marquardt2017-05-06T15:18:32Z<p>ThomasFoesel: how to install Theano, Keras, Matplotlib, ... on Windows</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows to create and share documents that contain live (Python) code, equations, visualizations and explanatory text. So, Jupyter serves a similar purpose like Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
Note: the monospaced text in this section are commands which have to be executed in a terminal.<br />
* for '''Linux/Mac''': The terminal is simply the system shell. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
* for '''Windows''': Type the commands into the Conda terminal which is part of the Miniconda installation (see below).<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download and install the [https://www.continuum.io/downloads#macos Anaconda collection] (make sure to select Python 3.x, the green button).<br />
*# <code># conda install jupyter Theano</code><br />
*# <code># pip install keras</code><br />
<br />
* Windows<br />
*# Download and install the [https://conda.io/miniconda.html Miniconda collection] (make sure to select Python 3.x, the upper row).<br />
*# Create a Conda environment with <br/> <code>conda create --name neuralnets python=3.5</code> <br/> (note that keras does not run on python 3.6 yet) and activate it via <br/> <code>activate neuralnets</code>.<br />
*# <code>conda install jupyter h5py hdf5 libpython m2w64-toolchain matplotlib mkl-service nose nose-parameterized numpy scipy sphinx theano yaml</code><br />
*# <code>pip install keras</code><br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
* Linux/Mac:<br />
*# <code>$ python3 -c "import keras"</code><br/>This command will probably fail as it tries to load TensorFlow, but this is OK. Its purpose is to initialize a .keras folder in your home directory<br />
*# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano"<br />
<br />
* Windows: Sorry, we still have to figure out how it works for Windows.<br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
TODO: minimal example for keras<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=52022017 Machine Learning for Physicists, by Florian Marquardt2017-05-02T16:01:09Z<p>ThomasFoesel: Improving the "Minimal examples" section</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows to create and share documents that contain live (Python) code, equations, visualizations and explanatory text. So, Jupyter serves a similar purpose like Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
Note for Linux/Mac: the monospaced text in this section are commands which are considered to be executed in the terminal. The "#" at the start of the line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if this is configured suitably), whereas the commands starting with "$" can be executed as a normal user.<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There might be alternative ways to do so; if you prefer another one that works for you, this is also fine, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download and install the [https://www.continuum.io/downloads#macos Anaconda collection] (make sure to select Python 3.x, the green button).<br />
*# <code># conda install jupyter Theano</code><br />
*# <code># pip install keras</code><br />
<br />
* Windows: Sorry, we still have to figure out how it works for Windows.<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
* Linux/Mac:<br />
*# <code>$ python3 -c "import keras"</code><br/>This command will probably fail as it tries to load TensorFlow, but this is OK. Its purpose is to initialize a .keras folder in your home directory<br />
*# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano"<br />
<br />
* Windows: Sorry, we still have to figure out how it works for Windows.<br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you (download the scripts, rename the file extension from ".txt" to ".py", and execute via <code>$ python3 <script.py></code>, e.g. <code>$ python3 theano_minimal.py</code>):<br />
<br />
[[media:theano_minimal.txt|Minimal example for Theano]]<br />
<br />
TODO: minimal example for keras<br />
<br />
[[media:matplotlib_minimal.txt|Minimal example for Matplotlib]]<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=File:Matplotlib_minimal.txt&diff=5201File:Matplotlib minimal.txt2017-05-02T15:55:12Z<p>ThomasFoesel: </p>
<hr />
<div></div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=File:Theano_minimal.txt&diff=5200File:Theano minimal.txt2017-05-02T15:54:47Z<p>ThomasFoesel: </p>
<hr />
<div></div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=2017_Machine_Learning_for_Physicists,_by_Florian_Marquardt&diff=51992017 Machine Learning for Physicists, by Florian Marquardt2017-05-02T15:37:04Z<p>ThomasFoesel: Basis for the software installation guide</p>
<hr />
<div>[[File:MachineLearningHeader.png]]<br />
<br />
=== Basic Information about this Lecture Series ===<br />
<br />
* Contact: [mailto:Florian.Marquardt@fau.de Florian.Marquardt@fau.de]<br />
* 2 hours/week, 5 ECTS credit points<br />
* '''Time/place''': This can still be discussed, but for now I have reserved lecture hall F on Monday 18:00-20:00 and on Thursday, 18:00-20:00. The reason for reserving two slots per week is that I will be traveling quite a bit during the summer term, so in some weeks we will have no lectures, whereas in others there will be two (i.e., on average, one lecture per week). The reason I believe such a late time slot is helpful is that it will not conflict with other lectures or tutorials, hopefully enabling anyone interested to attend. Note: Some students may also wish to attend the lecture on complex systems by Claus Metzner, which has some common themes with the present lecture.<br />
* '''First lecture''': Monday, May 8, 2017; 18:00, lecture hall F<br />
* '''Further lecture times''': See time table below.<br />
<br />
'''Description''': This is a course introducing modern techniques of machine learning, especially deep neural networks, to an audience of physicists. Neural networks can be trained to perform many challenging tasks, including image recognition and natural language processing, just by showing them many examples. While neural networks have been introduced already in the 70s, they really have taken off in the past decade, with spectacular successes in many areas. Often, their performance now surpasses humans, as proven by the recent achievements in handwriting recognition and in [http://www.nature.com/news/google-ai-algorithm-masters-ancient-game-of-go-1.19234 winning the game of 'Go'] against expert human players. They are now also being considered more and more for applications in physics, ranging from predictions of material properties to analyzing phase transitions.<br />
<br />
'''Contents''': We will cover the basics of neural networks (backpropagation), convolutional networks, autoencoders, restricted Boltzmann machines, and recurrent neural networks, as well as the recently emerging applications in physics. Time permitting, we will address other topics, like the relation to spin glass models, curriculum learning, reinforcement learning, adversarial learning, active learning, "robot scientists", deducing nonlinear dynamics, and dynamical neural computers.<br />
<br />
'''Prerequisites''': As a prerequisite you will only need matrix multiplication and the chain rule, i.e. the course will be understandable to bachelor students, master students and graduate students. However, knowledge of any computer programming language will make it much more fun. We will sometimes present examples using the 'python' programming language, which is a modern interpreted language with powerful linear algebra and plotting functions.<br />
<br />
'''Book''': The first parts of the course will rely heavily on the excellent and free online book by Nielsen: [https://neuralnetworksanddeeplearning.com "Neural Networks and Deep Learning"]<br />
<br />
'''Software''': Modern standard computers are powerful enough to run neural networks in a reasonable time. The following list of software packages helps to keep the programming effort low (it is possible to implement advanced structures like a deep convolutional neural network in only a dozen lines of code, which is quite amazing):<br />
<br />
* [https://www.python.org/ '''Python'''] is a widely used high-level programming language for general-purpose programming; both Theano and Keras are Python moduls. We '''highly recommend the usage of the 3.x branch''' (cmp. [https://wiki.python.org/moin/Python2orPython3 Python2 vs Python3]).<br />
* [http://deeplearning.net/software/theano/ '''Theano'''] is a numerical computation library for Python. In Theano, computations are expressed using a NumPy-like syntax and compiled to run efficiently on either CPU or GPU architectures. Therefore, Theano provides the low-level tools (multi-dimensional arrays, convolutional layers, efficient computation of the gradient, ...) needed to implement artificial neural networks.<br />
* [https://keras.io/ '''Keras'''] is a high-level framework for neural networks, running on top of Theano. Designed to enable fast experimentation with deep neural networks, it focuses on being minimal, modular and extensible.<br />
* [http://matplotlib.org/ '''Matplotlib'''] is a plotting library for the Python programming language. We use it to visualize our results.<br />
* [https://jupyter.org/ '''Jupyter'''] is a browser-based application that allows you to create and share documents containing live (Python) code, equations, visualizations and explanatory text. In this respect, Jupyter serves a similar purpose to Mathematica notebooks.<br />
<br />
All the software above is open source and freely available for a large number of platforms. See also the [[#Installation instructions|Installation instructions]] section below.<br />
<br />
<br />
=== Preliminary Schedule ===<br />
<br />
[[File:MachineLearningSchedule.png]]<br />
<br />
=== Installation instructions ===<br />
<br />
Note for Linux/Mac: the monospaced lines in this section are commands to be executed in a terminal. A "#" at the start of a line indicates that root privileges are required (so log in as root via <code>su</code>, or use <code>sudo</code> if it is configured suitably), whereas commands starting with "$" can be executed as a normal user.<br />
<br />
==== Installing Python, Theano, Keras, Matplotlib and Jupyter ====<br />
<br />
In the following, we show how to install these packages on the three common operating systems. There are alternative ways to do so; if you prefer another method that works for you, that is fine too, of course.<br />
<br />
* Linux<br />
** Debian/Mint/Ubuntu/...<br />
**# <code># apt-get install python3 python3-dev python3-matplotlib python3-nose python3-numpy python3-pip</code><br />
**# <code># pip3 install jupyter keras Theano</code><br />
** openSUSE<br />
**# <code># zypper in python3 python3-devel python3-jupyter_notebook python3-matplotlib python3-nose python3-numpy-devel</code><br />
**# <code># pip3 install Theano keras</code><br />
<br />
* Mac<br />
*# Download and install the [https://www.continuum.io/downloads#macos Anaconda collection] (make sure to select Python 3.x, the green button).<br />
*# <code>$ conda install jupyter Theano</code> (Anaconda installs into your home directory, so no root privileges are needed)<br />
*# <code>$ pip install keras</code><br />
<br />
* Windows: Sorry, we still have to figure out how it works for Windows.<br />
<br />
==== Configuration: tell Keras to use the Theano backend ====<br />
<br />
* Linux/Mac:<br />
*# <code>$ python3 -c "import keras"</code><br/>This command will probably fail as it tries to load TensorFlow, but that is OK. Its purpose is to initialize a ".keras" folder in your home directory.<br />
*# edit file ".keras/keras.json" in your home directory: replace "tensorflow" with "theano"<br />
<br />
* Windows: Sorry, we still have to figure out how it works for Windows.<br />
<br />
==== Minimal examples ====<br />
<br />
After the previous steps, the following scripts should work for you:<br />
<br />
TODO: minimal example for Theano<br />
<br />
TODO: minimal example for Keras<br />
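Similarly, a hypothetical minimal Keras script (API as of Keras 2; in Keras 1.x the <code>epochs</code> argument was called <code>nb_epoch</code>):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# fit a tiny network to the function y = 2x on 100 sample points
model = Sequential()
model.add(Dense(10, input_dim=1, activation='relu'))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = 2 * X
model.fit(X, y, epochs=200, verbose=0)

# predictions come back as a (n_samples, 1) array
print(model.predict(np.array([[0.5]])))
```

After training, the prediction for 0.5 should be roughly 1.0, though the exact value depends on the random initialization.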
<br />
TODO: minimal example for Matplotlib<br />
<br />
In addition, you should be able to start a Jupyter notebook via <code>$ jupyter notebook</code>.<br />
<br />
=== Links ===<br />
<br />
* [http://machinelearningmastery.com/inspirational-applications-deep-learning/ Eight inspirational applications of deep learning], from automatic colorization of images to playing games, by Jason Brownlee<br />
* [https://de.slideshare.net/LuMa921/deep-learning-the-past-present-and-future-of-artificial-intelligence Deep Learning Examples], a great set of slides on a large array of recent deep learning applications, by Lukas Masuch<br />
* [http://karpathy.github.io/neuralnets/ A Hacker's Guide to Neural Networks]: a slightly unconventional, practical introduction to backpropagation, by Andrej Karpathy<br />
* [http://karpathy.github.io/2015/05/21/rnn-effectiveness/ The Unreasonable Effectiveness of Recurrent Neural Networks], on creating fake Shakespeare plays or Wikipedia articles by training neural networks on them (character by character); by Andrej Karpathy</div>

Publications of the Marquardt group (page edited 2017-03-30 by ThomasFoesel: upload of Thomas Fösel's "L lines, C points and Chern numbers: understanding band structure topology using polarization fields" paper)
<hr />
<div>'''Note: As of August 1st, 2016, the Theory Division of the [http://www.mpl.mpg.de/en/institute/the-institute.html Max Planck Institute for the Science of Light (MPL)] in Erlangen will be led by Florian Marquardt. For the time being, we keep our web presence here until the MPL website has been established. Read about our publications (below) or about our [[Research|research topics]].'''<br />
<br />
This page lists publications from the Marquardt group in reverse chronological order. We start with a few recent highlights; the full reference list follows further below.<br />
<br />
{|<br />
| width=33%; valign="top"|[[File:BifurcationA.png|250px|center|link=http://journals.aps.org/pra/abstract/10.1103/PhysRevA.94.033821]]<br />
'''[[media:2016_Kusminskiy_Optomagnonics_PRA.pdf|Coupled spin-light dynamics in cavity optomagnonics]]''' (Viola Kusminskiy, Tang, and Marquardt, 2016) - In optomagnonic cavities, light couples parametrically to magnons via the Faraday effect. This coupling was demonstrated very recently, in two experiments appearing at the end of 2015. In this article, we derive the microscopic Hamiltonian of the system and study the optically induced dynamics of a homogeneous magnon mode. We show that the system exhibits a plethora of nonlinear effects, such as chaos and self-sustained oscillations, which should be tunable and experimentally observable in current setups. <br />
<br />
| width=33%; valign="top"|[[File:deampl.png|300px|center|link=https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.115.243603]]<br />
'''[[media:2015_Peano_de-amplification.pdf|Sensing enhanced by de-amplification]]''' (Peano, Schwefel, Marquardt and Marquardt, 2015) - In this article we show that the precision of position detection can be enhanced by the squeezing generated internally in an optomechanical parametric amplifier. Counterintuitively, the enhancement of the signal-to-noise ratio works by deamplifying precisely the quadrature that is sensitive to the mechanical motion without losing quantum information. <br />
<br />
| width=33%; valign="top"|[[File:Dyngauge.png|300px|center|link=http://iopscience.iop.org/article/10.1088/1367-2630/18/11/113029/meta]]<br />
'''[[media:2015_Walter_DynGaug.pdf|Dynamical Gauge Fields in Optomechanics]]''' (Walter and Marquardt, 2015) - In this article we show that the most basic phonon-assisted photon tunneling process, which arises from the optomechanical interaction, leads to a scenario where phonons act as a dynamical gauge field for photons, in contrast to the previously studied static gauge fields. In the optomechanical setting, these dynamical gauge fields arise quite naturally: the mechanical oscillation phases determine the effective artificial magnetic field for the photons, and once these phases are allowed to evolve, they respond to the flow of photons in the structure. [http://iopscience.iop.org/article/10.1088/1367-2630/18/11/113029/meta#close New Journal of Physics]<br />
<br />
|-<br />
<br />
| width=33%; valign="top"|[[File:topologicalphasessetupfig.png|center|link=http://www.thp2.nat.uni-erlangen.de/images/a/a4/2014_Peano_topologicalphasesofsoundandlight.pdf]]<br />
'''[http://journals.aps.org/prx/abstract/10.1103/PhysRevX.5.031011 Topological phases of sound and light]''' (Peano, Brendel, Schmidt, Marquardt 2015) - A Phonon Chern insulator is formed when an optomechanical array is driven by a laser with an appropriate pattern of phases. The resulting chiral, topologically<br />
protected phonon transport along the edges can be probed completely<br />
optically. Moreover, we identify a regime of strong mixing between<br />
photon and phonon excitations, which gives rise to a large set of<br />
different topological phases. This work was also highlighted in [http://www.nature.com/nphoton/journal/v9/n10/full/nphoton.2015.189.html#close Nature Photonics].<br />
<br />
<br />
| width=33%; valign="top"|[[File:2015SchmidtPhotonMagneticFields.png|center]]<br />
'''[[media:2015_Schmidt_Kessler_Peano_Marquardt_OMGaugeFields_Optica.pdf|Optomechanical magnetic fields for photons]]''' (Schmidt et al. 2015) - <br />
The optomechanical interaction between mechanical vibrations and light can be used to produce artificial magnetic fields for photons, in a tuneable way that is not tied to the geometry (as other approaches are) and is controlled entirely optically. This work was also highlighted in [http://www.nature.com/nphoton/journal/v9/n9/full/nphoton.2015.172.html Nature Photonics].<br />
<br />
<br />
<br />
| width=33%; valign="top"|[[File:2013Metamaterials.png|200px|center|link=http://iopscience.iop.org/1367-2630/17/2/023025/]]<br />
'''[http://iopscience.iop.org/1367-2630/17/2/023025/ Optomechanical Dirac Physics]''' (Schmidt, Peano, Marquardt 2015) - Photonic crystals with many localized photonic and phononic modes could be used to form 'optomechanical arrays'. This paper predicts that engineering their optomechanical band structure gives access to many phenomena usually known in condensed matter. In particular, we predict optomechanical variants of the Dirac physics known from graphene, now affecting the transport of photon-phonon polaritons on a honeycomb lattice.<br />
<br />
|-<br />
<br />
| width=33%; valign="top"|[[File:2013TangSync.png|center|link=http://thp2.nat.fau.de/images/8/89/2013_Bagheri_OptomechanicalSynchronizationExperiment.pdf]]<br />
'''[[media:2013_Bagheri_OptomechanicalSynchronizationExperiment.pdf|Optomechanical synchronization]]''' (Bagheri et al 2013) - In this experiment of the Tang group at Yale, two 'distant' nanomechanical resonators are coupled via the optical field inside a racetrack optical cavity. Their oscillations are observed to synchronize, which had previously been demonstrated only for disk resonators almost touching each other. In addition, novel features like peculiar sidebands in the observed mechanical spectrum show up. These hint at dynamics beyond the most widely used models of synchronization.<br />
<br />
<br />
| width=33%; valign="top"|[[File:2013KesslerCurrents.png|center|link=http://thp2.nat.fau.de/images/a/ad/2013_Kessler_CurrentStatistics.pdf]]<br />
'''[[media:2013_Kessler_CurrentStatistics.pdf|Where do the currents flow?]]''' (Kessler, Marquardt 2014) - In optical lattices, it is now experimentally possible to detect the precise location of single atoms. This paper suggests that this novel tool could also be used to take 'snapshots' of current patterns. These fluctuating patterns could reveal, via their statistics, important information about quantum many-body states of ultracold atoms, e.g. when an artificial magnetic field is applied. <br />
<br />
<br />
<br />
| width=33%; valign="top"|[[File:2013KronwaldOMITNonlinQR.png|center|link=http://thp2.nat.fau.de/images/d/dc/2013_Kronwald_OMITstrongcoupling.pdf]]<br />
'''[[media:2013_Kronwald_OMITstrongcoupling.pdf|Signatures of quantum nonlinearities]]''' (Kronwald, Marquardt 2013) - Optomechanical experiments are not yet able to observe indications of the nonlinear quantum nature of the optomechanical interaction. However, experiments are coming closer to this "nonlinear quantum regime". In this work, we propose how first indications of this nonlinear quantum regime could be observed in a two-tone driving experiment using near-future optomechanical devices.<br />
<br />
|-<br />
<br />
| width=33%; valign="top"|[[File:2013Shuttle.png|center|link=http://thp2.nat.fau.de/images/3/32/2013_Moeckel_ShuttleNonlinearDynamics.pdf]]<br />
'''[[media:2013_Moeckel_ShuttleNonlinearDynamics.pdf|Shuttling electrons, one by one]]''' (Moeckel et al. 2014) - Nanomechanical electron shuttles are little metallic islands that vibrate between electrodes, carrying electrons from one electrode to the other. In principle, they could be exploited to produce a precise current standard, essentially by counting the number of electrons. However, keeping track of the count is not so easy. In this paper, it is shown that the nonlinear dynamics of such a shuttle permits a trick: synchronization of self-oscillations to an external drive. This could drastically increase the precision.<br />
<br />
|<br />
|}<br />
<br />
<br />
== 2017 ==<br />
<br />
* '''L lines, C points and Chern numbers: understanding band structure topology using polarization fields'''<br />
Thomas Fösel, Vittorio Peano, and Florian Marquardt, arXiv:1703.08191 (2017)<br />
[https://arxiv.org/abs/1703.08191 Journal] [[media:Arxiv.1703.08191.pdf|PDF]]<br />
<br />
* '''Snowflake Topological Insulator for Sound Waves'''<br />
Christian Brendel, Vittorio Peano, Oskar Painter, and Florian Marquardt, arXiv:1701.06330 (2017)<br />
[https://arxiv.org/abs/1701.06330 Journal] [[media:2017_Brendel_Sonowflake.pdf|PDF]]<br />
<br />
* '''Generalized non-reciprocity in an optomechanical circuit via synthetic magnetism and reservoir engineering'''<br />
Kejie Fang, Jie Luo, Anja Metelmann, Matthew H. Matheny, Florian Marquardt, Aashish A. Clerk, and Oskar Painter, Nature Physics 2017 (Advance Online Publication) [http://www.nature.com/nphys/journal/vaop/ncurrent/full/nphys4009.html Journal]<br />
<br />
* ''' Anderson Localization of Composite Excitations in Disordered Optomechanical Arrays'''<br />
Thales Figueiredo Roque, Vittorio Peano, Oleg M. Yevtushenko, Florian Marquardt, New J. Phys. 19, 013006 (2017)<br />
[http://iopscience.iop.org/article/10.1088/1367-2630/aa52e2/meta Journal] [[media:2016_Figueiredol_AndersonLocalizationNJP.pdf|PDF]]<br />
<br />
== 2016 ==<br />
<br />
* ''' Dynamical Gauge Fields in Optomechanics '''<br />
Stefan Walter and Florian Marquardt, New Journal of Physics 18, 113029 (2016) [http://iopscience.iop.org/article/10.1088/1367-2630/18/11/113029/meta Journal] [[media:2015_Walter_DynGaug.pdf |PDF]]<br />
<br />
* ''' Topological quantum fluctuations and travelling wave amplifiers'''<br />
Vittorio Peano, Martin Houde, Florian Marquardt, and Aashish Clerk, Phys. Rev. X 6, 041026 (2016)<br />
[https://journals.aps.org/prx/abstract/10.1103/PhysRevX.6.041026 Journal][[media:2016_Peano_Topologicalamplifier_PRX.pdf|PDF]]<br />
<br />
* '''Quantum Theory of Continuum Optomechanics'''<br />
Peter Rakich and Florian Marquardt, arXiv:1610.03012 (2016) [https://arxiv.org/abs/1610.03012 Journal] [[media:2016_RakichMarquardt_ContinuumOptomechanics.pdf|PDF]]<br />
<br />
* '''Quantum-coherent phase oscillations in synchronization'''<br />
Talitha Weiss, Stefan Walter, and Florian Marquardt, arXiv:1608.03550 (2016)<br />
[https://arxiv.org/abs/1608.03550 Journal] [[media:2016_Weiss_QuantumCoherentPhaseSync.pdf|PDF]]<br />
<br />
* '''Pseudomagnetic fields for sound at the nanoscale'''<br />
Christian Brendel, Vittorio Peano, Oskar Painter, and Florian Marquardt, arXiv:1607.04321 (2016)<br />
[https://arxiv.org/abs/1607.04321 Journal] [[media:2016_Brendel_PseudoMagneticFields.pdf|PDF]]<br />
<br />
* ''' From Kardar-Parisi-Zhang scaling to explosive desynchronization in arrays of limit-cycle oscillators'''<br />
Roland Lauter, Aditi Mitra, Florian Marquardt, arXiv:1607.03696 (2016)<br />
[https://arxiv.org/abs/1607.03696 Journal] [[media:2016_Lauter_From_KPZ_to_explosive_desync.pdf|PDF]]<br />
<br />
* ''' Quantum Nondemolition Measurement of a Quantum Squeezed State Beyond the 3 dB Limit'''<br />
C. U. Lei, A. J. Weinstein, J. Suh, E. E. Wollman, A. Kronwald, F. Marquardt, A. A. Clerk, and K. C. Schwab, Phys. Rev. Lett. '''117''', 100801 (2016)<br />
[http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.117.100801 Journal] [[media:2016_Lei_QND_msmnt_mech_squeezing_beyond_3db.pdf|PDF]]<br />
<br />
* '''Coupled spin-light dynamics in cavity optomagnonics'''<br />
Silvia Viola Kusminskiy, Hong Tang, and Florian Marquardt, Phys. Rev. A 94, 033821 (2016)<br />
[http://journals.aps.org/pra/abstract/10.1103/PhysRevA.94.033821 Journal][[media:2016_Kusminskiy_Optomagnonics_PRA.pdf|PDF]]<br />
<br />
* '''Many-particle dephasing after a quench'''<br />
Thomas Kiendl and Florian Marquardt, arXiv:1603.01071 [http://arxiv.org/abs/1603.01071 Journal] [[media:2016_Kiendl_QuenchManyParticleDephasing_arxiv.pdf|PDF]]<br />
<br />
* ''' Topological phase transitions and chiral inelastic transport induced by the squeezing of light'''<br />
Vittorio Peano, Martin Houde, Christian Brendel, Florian Marquardt, and Aashish Clerk, Nature Communications '''7''', 10779 (2016)<br />
[http://www.nature.com/ncomms/2016/160302/ncomms10779/full/ncomms10779.html Journal] [[media:2016_Peano_TopologicalPhasesOfSqueezedLight.pdf|PDF]]<br />
<br />
* '''Noise-induced transitions in optomechanical synchronization''' <br />
Talitha Weiss, Andreas Kronwald, and Florian Marquardt, New Journal of Physics '''18''', 013043 (2016) [http://iopscience.iop.org/article/10.1088/1367-2630/18/1/013043 Journal] [[media:2015_Weiss_NoiseSync.pdf|PDF]]<br />
<br />
== 2015 ==<br />
<br />
*'''Quantum simulation of expanding space-time with tunnel-coupled condensates'''<br />
Clemens Neuenhahn and Florian Marquardt, New Journal of Physics '''17''', 125007 (2015) [http://iopscience.iop.org/article/10.1088/1367-2630/17/12/125007 Journal] [[media:2015_NeuenhahnMarquardt_QuantumSimulationExpandingSpacetime.pdf|PDF]]<br />
<br />
*'''Intracavity squeezing can enhance quantum-limited optomechanical position detection through de-amplification'''<br />
V. Peano, H. G. L. Schwefel, Ch. Marquardt, F. Marquardt, Phys. Rev. Lett. '''115''', 243603 (2015) [https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.115.243603 Journal][[media:2015_Peano_de-amplification.pdf |PDF]]<br />
<br />
* '''Position-squared coupling in a tunable photonic crystal optomechanical cavity'''<br />
Taofiq K. Paraiso, Mahmoud Kalaee, Leyun Zang, Hannes Pfeifer, Florian Marquardt, Oskar Painter, Phys. Rev. X '''5''', 041024 (2015)<br />
[https://journals.aps.org/prx/abstract/10.1103/PhysRevX.5.041024 Journal] [[media:2015_Paraiso_PositionSquaredCouplingPRX.pdf|PDF]]<br />
<br />
* ''' Topological Phases of Sound and Light'''<br />
Vittorio Peano, Christian Brendel, Michael Schmidt, and Florian Marquardt, Phys. Rev. X '''5''', 031011 (2015)<br />
[http://journals.aps.org/prx/abstract/10.1103/PhysRevX.5.031011 Journal] [[media:2015_Peano_TopologicalPhasesOfSoundAndLight_PRX.pdf|PDF]] - highlighted in [http://www.nature.com/nphoton/journal/v9/n10/full/nphoton.2015.189.html#close Nature Photonics]<br />
<br />
* '''Magnon dark modes and gradient memory''' <br />
Xufeng Zhang, Chang-Ling Zou, Na Zhu, Florian Marquardt, Liang Jiang, Hong X. Tang, Nature Communications '''6''', 8914 (2015)<br />
[http://www.nature.com/ncomms/2015/151116/ncomms9914/full/ncomms9914.html Journal] [[media:2015_Tang_MagnonGradientMemoryNatComm.pdf|PDF]]<br />
<br />
* '''Quantum squeezing of motion in a mechanical resonator''' <br />
E. E. Wollman, C. U. Lei, A. J. Weinstein, J. Suh, A. Kronwald, F. Marquardt, A. A. Clerk, and K. C. Schwab, Science '''349''', 952 (2015) [http://www.sciencemag.org/content/349/6251/952.full Journal] [[media:2015_Wollman_DissMechSqueezingExperiment.pdf|PDF (preprint)]]<br />
<br />
* '''Optomechanical creation of magnetic fields for photons on a lattice'''<br />
M. Schmidt, S. Keßler, V. Peano, O. Painter, F. Marquardt, Optica '''2''', 635 (2015)<br />
[http://www.osapublishing.org/optica/abstract.cfm?uri=optica-2-7-635 Journal][[media:2015_Schmidt_Kessler_Peano_Marquardt_OMGaugeFields_Optica.pdf|PDF]] -<br />
[http://www.nature.com/nphoton/journal/v9/n9/full/nphoton.2015.172.html Highlighted in Nature Photonics]<br />
<br />
* '''Nonlinear radiation pressure dynamics in an optomechanical crystal'''<br />
Alex G. Krause, Jeff T. Hill, Max Ludwig, Amir H. Safavi-Naeini, Jasper Chan, Florian Marquardt, and Oskar Painter, Phys. Rev. Lett. '''115''', 233601 (2015)<br />
[http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.115.233601 Journal] [[media:2015_Krause_NonlinearDynamics_OptomechanicalCrystalPRL.pdf|PDF]]<br />
<br />
* ''' Optomechanical Dirac Physics'''<br />
Michael Schmidt, Vittorio Peano, and Florian Marquardt, New Journal of Physics '''17''', 023025 (2015)<br />
[http://iopscience.iop.org/1367-2630/17/2/023025/ Journal] [[media:2014_Schmidt_OMDiracPhysics_NJP.pdf|PDF]]<br />
<br />
* '''Pattern phase diagram for 2D arrays of coupled limit-cycle oscillators'''<br />
Roland Lauter, Christian Brendel, Steven J. M. Habraken, and Florian Marquardt, <br />
Phys. Rev. E '''92''', 012902 (2015)<br />
[http://link.aps.org/doi/10.1103/PhysRevE.92.012902 Journal] [[media:2015_Lauter_Brendel_Habraken_Marquardt_Pattern_phase_diagram_PRE.pdf|PDF]]<br />
<br />
== 2014 ==<br />
<br />
* '''Cavity optomechanics'''<br />
Markus Aspelmeyer, Tobias Kippenberg, and Florian Marquardt, <br />
Reviews of Modern Physics 86, 1391 (2014)<br />
[http://journals.aps.org/rmp/abstract/10.1103/RevModPhys.86.1391#abstract Journal] [[media:2014_AKM_OptomechanicsReview.pdf|PDF]]<br />
<br />
* '''Focus on optomechanics'''<br />
Ivan Favero and Florian Marquardt, New Journal of Physics 16, 085006 (2014)<br />
[http://iopscience.iop.org/1367-2630/16/8/085006/ Journal] [[media:2014_FaveroMarquardt_NJP_Focus.pdf|PDF]]<br />
<br />
* '''Decoherence in a double-dot Aharonov-Bohm interferometer: Numerical renormalization group study'''<br />
Björn Kubala, David Roosen, Michael Sindel, Walter Hofstetter, and Florian Marquardt, Phys. Rev. B 90, 035417 (2014)<br />
[http://journals.aps.org/prb/abstract/10.1103/PhysRevB.90.035417 Journal] [[media:2014_Kubala_DoubleDot.pdf|PDF]]<br />
<br />
* '''Entanglement rate for Gaussian continuous variable beams''' <br />
Zhi Jiao Deng, Steven J. M. Habraken and Florian Marquardt, New Journal of Physics 18, 063022 (2016) [http://iopscience.iop.org/article/10.1088/1367-2630/18/6/063022/ Journal] [[media:2014_Deng_Entanglement Rate.pdf|PDF]]<br />
<br />
* '''Cavity Optomechanics: Nano- and Micromechanical Resonators Interacting with Light''' (book)<br />
Editors: Markus Aspelmeyer, Tobias J. Kippenberg, Florian Marquardt, Springer 2014. [http://link.springer.com/book/10.1007/978-3-642-55312-7 Springer Website for this Book]<br />
<br />
* '''Single-site-resolved measurement of the current statistics in optical lattices''' <br />
Stefan Kessler and Florian Marquardt, Phys. Rev. A 89, 061601(R) (2014) [http://link.aps.org/doi/10.1103/PhysRevA.89.061601 Journal] <br />
[[media:2013_Kessler_CurrentStatistics.pdf|PDF]]<br />
<br />
* '''Quantum Optomechanics''' (Les Houches Lecture Notes)<br />
Florian Marquardt, in '''Quantum Machines: Measurement and Control of Engineered Quantum Systems''', eds. Michel Devoret, Benjamin Huard, Robert Schoelkopf, and Leticia F. Cugliandolo, Oxford University Press 2014. See [[media:2014_ChapterDraftLesHouches.pdf|Draft PDF]] and link to [http://ukcatalogue.oup.com/product/9780199681181.do publisher website].<br />
<br />
* '''Synchronizing a single-electron shuttle to an external drive'''<br />
Michael J. Moeckel, Darren R. Southworth, Eva M. Weig, and Florian Marquardt, New Journal of Physics 16, 043009 (2014) [http://iopscience.iop.org/1367-2630/16/4/043009/ Journal] [[media:2013_Moeckel_ShuttleNonlinearDynamics.pdf|PDF]]<br />
<br />
* '''Laser Theory for Optomechanics: Limit Cycles in the Quantum Regime'''<br />
Niels Lörch, Jiang Qian, Aashish Clerk, Florian Marquardt, and Klemens Hammerer, Phys. Rev. X 4, 011015 (2014)<br />
[http://journals.aps.org/prx/abstract/10.1103/PhysRevX.4.011015 Journal] [[media:2014_Loerch_LimitCyclesOptomechanics.pdf|PDF]]<br />
<br />
* '''Dissipative optomechanical squeezing of light''' <br />
Andreas Kronwald, Florian Marquardt, and Aashish A. Clerk, New J. Phys. 16 (2014) 063058 [http://iopscience.iop.org/1367-2630/16/6/063058/ Journal] [[media:2014_Kronwald_DissOMSqueezingOfLight.pdf|PDF]]<br />
<br />
== 2013 ==<br />
<br />
* '''The effect of Landau–Zener dynamics on phonon lasing'''<br />
Huaizhi Wu, Georg Heinrich, and Florian Marquardt, New Journal of Physics 15, 123022 (2013) [http://iopscience.iop.org/1367-2630/15/12/123022/ Journal] [[media:2013_12_WuHeinrichMarquardt_LZS_PhononLasing.pdf|PDF]]<br />
<br />
* '''Optomechanical Metamaterials: Dirac polaritons, Gauge fields, and Instabilities''' <br />
Michael Schmidt, Vittorio Peano and Florian Marquardt, arXiv:1311.7095 [http://arxiv.org/abs/1311.7095 Journal] [[media:2013_Schmidt_OptomechanicalMetamaterials.pdf|PDF]]<br />
<br />
* '''Photonic Cavity Synchronization of Nanomechanical Oscillators'''<br />
M. Bagheri, M. Poot, L. Fan, F. Marquardt, H. X. Tang, Phys. Rev. Lett. 111, 213902 (2013) [http://prl.aps.org/abstract/PRL/v111/i21/e213902 Journal] [[media:2013_Bagheri_OptomechanicalSynchronizationExperiment.pdf|PDF]]<br />
<br />
*'''Quantum many-body dynamics in optomechanical arrays'''<br />
Max Ludwig and Florian Marquardt, Phys. Rev. Lett. 111, 073603 (2013) [http://prl.aps.org/abstract/PRL/v111/i7/e073603 Journal]<br />
[[media:2013_Ludwig_QuantumManyBodyDynamics.pdf|PDF]] <br />
<br />
* '''Arbitrarily large steady-state bosonic squeezing via dissipation''' <br />
Andreas Kronwald, Florian Marquardt, and Aashish A. Clerk, Phys. Rev. A 88, 063833 (2013) [http://pra.aps.org/abstract/PRA/v88/i6/e063833 Journal] [[media:2013_Kronwald_SteadyStateSqueezing.pdf|PDF]]<br />
<br />
* '''Creation and dynamics of remote spin-entangled pairs in the expansion of strongly correlated fermions in an optical lattice'''<br />
Stefan Kessler, Ian P. McCulloch, and Florian Marquardt, New J. Phys. 15, 053043 (2013) [http://iopscience.iop.org/1367-2630/15/5/053043/ Journal]<br />
[[media:2013_Kessler_Entanglement_Expansion.pdf|PDF]]<br />
<br />
* '''Optomechanically Induced Transparency in the Nonlinear Quantum Regime''' <br />
Andreas Kronwald and Florian Marquardt, Phys. Rev. Lett. 111, 133601 (2013) [http://prl.aps.org/abstract/PRL/v111/i13/e133601 Journal] [[media:2013_Kronwald_OMITstrongcoupling.pdf|PDF]] <br />
<br />
* '''The quantum transverse-field Ising chain in circuit QED: effects of disorder on the nonequilibrium dynamics'''<br />
Oliver Viehmann, Jan von Delft, and Florian Marquardt, New J. Phys. 15, 035013 (2013) [http://iopscience.iop.org/1367-2630/15/3/035013/ Journal]<br />
[[media:2013_Viehmann_TFIC_disorder.pdf|PDF]]<br />
<br />
* '''Gain-tunable optomechanical cooling in a laser cavity'''<br />
Li Ge, Sanli Faez, Florian Marquardt, Hakan E. Tureci, Phys. Rev. A 87, 053839 (2013) [http://pra.aps.org/abstract/PRA/v87/i5/e053839 Journal]<br />
<br />
*'''Observing the Nonequilibrium Dynamics of the Quantum Transverse-Field Ising Chain in Circuit QED'''<br />
Oliver Viehmann, Jan von Delft, and Florian Marquardt, Phys. Rev. Lett. 110, 030601 (2013) <br />
[http://prl.aps.org/abstract/PRL/v110/i3/e030601 Journal]<br />
[[media:2013_Viehmann_IsingChainInCQED.pdf|PDF]]<br />
<br />
== 2012 ==<br />
<br />
* '''Quantum Signatures of the Optomechanical Instability'''<br />
Jiang Qian, Aashish Clerk, Klemens Hammerer, and Florian Marquardt, Phys. Rev. Lett. 109, 253601 (2012) [http://prl.aps.org/abstract/PRL/v109/i25/e253601 Journal] [[media:2012_Qian_PRL_QuantumSignaturesOptomechanicalInstability.pdf|PDF]]<br />
<br />
* '''Dynamics of levitated nanospheres: towards the strong coupling regime'''<br />
T. S. Monteiro, J. Millen, G. A. T. Pender, Florian Marquardt, D. Chang, and P. F. Barker, New Journal of Physics 15, 015001 [http://iopscience.iop.org/1367-2630/15/1/015001 Journal] [[media:2012_Monteiro_LevitatingSpheresNJP.pdf|PDF]]<br />
<br />
* '''Full photon statistics of a light beam transmitted through an optomechanical system'''<br />
Andreas Kronwald, Max Ludwig, and Florian Marquardt, Phys. Rev. A 87, 013847 (2013) [http://pra.aps.org/abstract/PRA/v87/i1/e013847 Journal] [[media:2012_Kronwald_PRA_FullPhotonStatistics.pdf|PDF]]<br />
<br />
* '''Optomechanical circuits for nanomechanical continuous variable quantum state processing'''<br />
Michael Schmidt, Max Ludwig, and Florian Marquardt, New J. Phys. 14 125005 (2012) [http://stacks.iop.org/1367-2630/14/125005 Journal] [[media:2012_Schmidt_OptmechQuantCircuits.pdf|PDF]] <br />
<br />
* '''Enhanced Quantum Nonlinearities in a Two-Mode Optomechanical System'''<br />
Max Ludwig, Amir H. Safavi-Naeini, Oskar Painter and Florian Marquardt, Phys. Rev. Lett. 109, 063601 (2012) [http://link.aps.org/doi/10.1103/PhysRevLett.109.063601 Journal] [[media:2012_Ludwig_PhononPhotonDetection_PRL.pdf|PDF]] <br />
<br />
* '''Localized phase structures growing out of quantum fluctuations in a quench of tunnel-coupled atomic condensates'''<br />
Clemens Neuenhahn, Anatoli Polkovnikov and Florian Marquardt, Phys. Rev. Lett. 109, 085304 (2012)<br />
[http://link.aps.org/doi/10.1103/PhysRevLett.109.085304 Journal] [[media:2012_NeuenhahnSineGordon.pdf|PDF]] <br />
<br />
* '''Optomechanical cooling of levitated spheres with doubly-resonant fields'''<br />
G. A. T. Pender, P. F. Barker, Florian Marquardt, J. Millen, and T. S. Monteiro, Phys. Rev. A 85, 021802(R) (2012) [http://pra.aps.org/abstract/PRA/v85/i2/e021802 Journal] [[media:2012_Pender_CoolingLevitatedSpheres.pdf|PDF]]<br />
<br />
* '''Stroboscopic observation of quantum many-body dynamics'''<br />
Stefan Kessler, Andreas Holzner, Ian P. McCulloch, Jan von Delft, and Florian Marquardt, Phys. Rev. A 85, 011605(R) (2012) [http://pra.aps.org/abstract/PRA/v85/i1/e011605 Journal] [[media:2011_Kessler_Stroboscopic.pdf|PDF]] [http://www.citeulike.org/group/13403/article/10216549 Cite]<br />
<br />
* '''Observation of spontaneous Brillouin cooling'''<br />
Gaurav Bahl, Matthew Tomes, Florian Marquardt, and Tal Carmon, Nature Physics 8, 203 (2012) [http://www.nature.com/nphys/journal/vaop/ncurrent/pdf/nphys2206.pdf Journal]<br />
<br />
== 2011 ==<br />
<br />
* '''Superradiant Phase Transitions and the Standard Description of Circuit QED'''<br />
Oliver Viehmann, Jan von Delft, and Florian Marquardt, Phys. Rev. Lett. 107, 113602 (2011) [http://prl.aps.org/abstract/PRL/v107/i11/e113602 Journal] [[media:2011_Viehmann_SuperradiantPTs.pdf|PDF]] [http://www.citeulike.org/group/13403/article/9061601 Cite]<br />
<br />
* '''Quantum Mechanical Theory of Optomechanical Brillouin Cooling'''<br />
M. Tomes, F. Marquardt, G. Bahl, and T. Carmon, Physical Review A, 84, 063806 (2011) [http://pra.aps.org/abstract/PRA/v84/i6/e063806 Journal] [[media:2011_Tomes_BrillouinCooling.pdf|PDF]]<br />
<br />
* '''Collective dynamics in optomechanical arrays'''<br />
Georg Heinrich, Max Ludwig, Jiang Qian, Björn Kubala, Florian Marquardt, Phys. Rev. Lett. 107, 043603 (2011) [http://prl.aps.org/abstract/PRL/v107/i4/e043603 Journal] [[media:2010_Heinrich_CollectiveDynamics_PRL.pdf|PDF]] [http://www.citeulike.org/group/13403/article/9572316 Cite]<br />
<br />
* '''Dynamics of coupled multimode and hybrid optomechanical systems'''<br />
Georg Heinrich, Max Ludwig, Huaizhi Wu, K. Hammerer and Florian Marquardt, C. R. Physique 12, 837 (2011) [http://www.sciencedirect.com/science/article/pii/S1631070511000569 Journal] [[media:2011_Heinrich_DynCoupledMultiOptp.pdf|PDF]]<br />
<br />
* '''Coupled multimode optomechanics in the microwave regime'''<br />
Georg Heinrich and Florian Marquardt, Europhys. Lett. 93, 18003 (2011) [http://iopscience.iop.org/0295-5075/93/1/18003/ Journal] [[media:2011_Heinrich_EPLCoupledOptoMW.pdf|PDF]] [http://www.citeulike.org/group/13403/article/8755967 Cite]<br />
<br />
== 2010 ==<br />
<br />
* '''Entanglement of mechanical oscillators coupled to a non-equilibrium environment'''<br />
Max Ludwig, K. Hammerer, Florian Marquardt, Phys. Rev. A 82, 012333 (2010) [http://pra.aps.org/abstract/PRA/v82/i1/e012333 Journal] [[media:2010_LudwigHammererMarquardt_Entanglement_PRA.pdf|PDF]] [http://www.citeulike.org/user/Max_Ludwig/article/7594546 Cite]<br />
<br />
* '''Thermalization of Interacting Fermions and Delocalization in Fock space'''<br />
Clemens Neuenhahn and Florian Marquardt, Phys. Rev. E 85, 060101(R) (2012) [http://link.aps.org/doi/10.1103/PhysRevE.85.060101 Journal]<br />
<br />
* '''Examples of Quantum Dynamics in Optomechanical Systems'''<br />
Max Ludwig, Georg Heinrich and F. Marquardt; in Quantum Communication and Quantum Networking (Springer 2010); proceedings of<br />
QuantumComm 2009, Naples, Italy; [http://www.springerlink.com/content/m85l92288477368t Journal] [http://www.citeulike.org/group/13403/article/7220538 Cite]<br />
<br />
* '''Introduction to Quantum Noise, Measurement and Amplification''' <br />
A. A. Clerk, M. H. Devoret, S. M. Girvin, F. Marquardt, and R. J. Schoelkopf, Rev. Mod. Phys. 82, 1155 (2010)<br />
[http://rmp.aps.org/abstract/RMP/v82/i2/p1155_1 Journal] [[media:2010_Clerk_QuantumNoiseReview.pdf|PDF (main text)]] [[media:2010_Clerk_QNoiseReviewAppendices.pdf|PDF (appendices)]] [http://www.citeulike.org/user/FlorianMarquardt/article/3454550 Cite]<br />
<br />
* '''Quantum Measurement of Phonon Shot Noise'''<br />
Aashish Clerk, Florian Marquardt, Jack Harris, Phys. Rev. Lett. 104, 213603 (2010) [http://link.aps.org/doi/10.1103/PhysRevLett.104.213603 Journal] [[media:2010_05_Clerk_PhononShotNoise.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7225451 Cite]<br />
<br />
* '''Electron-Plasmon scattering in chiral 1D systems with nonlinear dispersion'''<br />
Markus Heyl, Stefan Kehrein, Florian Marquardt, Clemens Neuenhahn, Phys. Rev. B 82, 033409 (2010) [http://prb.aps.org/abstract/PRB/v82/i3/e033409 Journal] [[media:2010_Plasmon.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100241 Cite]<br />
<br />
* '''Dimensional Crossover of the Dephasing Time in Disordered Mesoscopic Rings: From Diffusive through Ergodic to 0D Behavior'''<br />
M. Treiber, O.M. Yevtushenko, F. Marquardt, J. von Delft, I.V. Lerner, in "Perspectives of Mesoscopic Physics - Dedicated to Yoseph Imry's 70th Birthday", edited by Amnon Aharony and Ora Entin-Wohlman (World Scientific, 2010), chap. 20, p. 371-396, ISBN-13 978-981-4299-43-5; arXiv:1001.0479 [http://arxiv.org/abs/1001.0479 Journal] [[media:2010_Treiber_RingDephasingLong.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100245 Cite]<br />
<br />
* '''Single-Atom Cavity QED and Opto-Micromechanics'''<br />
M. Wallquist, K. Hammerer, P. Zoller, C. Genes, M. Ludwig, F. Marquardt, P. Treutlein, J. Ye, H. J. Kimble, Phys. Rev. A 81, 023816 (2010) [http://pra.aps.org/abstract/PRA/v81/i2/e023816 Journal] [[media:2010_Wallquist_AtomLong.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/6740552 Cite]<br />
* '''Optimal control of circuit quantum electrodynamics in one and two dimensions'''<br />
R. Fisher, F. Helmer, S. J. Glaser, F. Marquardt, T. Schulte-Herbrueggen, Phys. Rev. B 81, 085328 (2010) [http://prb.aps.org/abstract/PRB/v81/i8/e085328 Journal] [[media:2010_Fisher_OptimalControl.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100307 Cite]<br />
* '''AC-Conductance through an Interacting Quantum Dot'''<br />
Björn Kubala, Florian Marquardt, Phys. Rev. B 81, 115319 (2010) [http://prb.aps.org/abstract/PRB/v81/i11/e115319 Journal] [[media:2010_Kubala_ACconductance.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100313 Cite]<br />
* '''The photon shuttle: Landau-Zener-Stueckelberg dynamics in an optomechanical system'''<br />
Georg Heinrich, J. G. E. Harris, Florian Marquardt, Phys. Rev. A 81, 011801(R) (2010) [http://pra.aps.org/abstract/PRA/v81/i1/e011801 Journal] [[media:2010_Heinrich_LandauZener.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/6538447 Cite]<br />
<br />
== 2009 ==<br />
<br />
* '''The dephasing rate formula in the many body context'''<br />
Doron Cohen, Jan von Delft, Florian Marquardt, Yoseph Imry, Phys. Rev. B 80, 245410 (2009) [http://prb.aps.org/abstract/PRB/v80/i24/e245410 Journal] [[media:2009_Cohen_DephasingRate.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100317 Cite]<br />
<br />
* '''Toolbox of resonant quantum gates in Circuit QED''' <br />
G. Haack, F. Helmer, M. Mariantoni, F. Marquardt, and E. Solano, Phys. Rev. B 82, 024514 (2010). [http://arxiv.org/abs/0908.3673 Journal] [[media:2009_08_Haack_ResonantGates.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100501 Cite]<br />
* '''Dimensional Crossover of the Dephasing Time in Disordered Mesoscopic Rings''' <br />
M. Treiber, O. M. Yevtushenko, F. Marquardt, J. v. Delft, and I. V. Lerner, Phys. Rev. B 80, 201305(R) (2009) [http://prb.aps.org/abstract/PRB/v80/i20/e201305 Journal] [[media:2009_Treiber_RingDephasing.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/7100168 Cite]<br />
* '''Optomechanics''' <br />
F. Marquardt and S. M. Girvin, Physics 2, 40 (2009) [http://physics.aps.org/articles/v2/40 Journal] [[media:2009_05_MarquardtGirvin_OptomechanicsReviewPhysics.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/5282943 Cite]<br />
* '''Strong coupling of a mechanical oscillator and a single atom''' <br />
K. Hammerer, M. Wallquist, C. Genes, M. Ludwig, F. Marquardt, P. Treutlein, P. Zoller, J. Ye, H. J. Kimble, Phys. Rev. Lett. 103, 063005 (2009)<br />
[http://prl.aps.org/abstract/PRL/v103/i6/e063005 Journal] [[media:2009_Hammerer_StrongCouplingAtomMembrane.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/5394883 Cite]<br />
* '''Optomechanics''' (proceedings NATO Workshop Tashkent 2008)<br />
B. Kubala, M. Ludwig, and F. Marquardt, arXiv:0902.2163; published in the proceedings of the NATO Advanced Research Workshop ’Recent Advances in Nonlinear Dynamics and Complex System Physics’, Tashkent, Uzbekistan 2008; Springer 2009. <br />
[http://arxiv.org/abs/0902.2163 Journal] [[media:2009_02_OptomechanicsReview_TashkentProceedings.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/4044819 Cite]<br />
* '''Measurement-based Synthesis of multi-qubit Entangled States in Superconducting Cavity QED''' <br />
F. Helmer and F. Marquardt, Phys. Rev. A 79, 052328 (2009) <br />
[http://pra.aps.org/abstract/PRA/v79/i5/e052328 Journal] [[media:2009_05_HelmerMarquardt_EntanglementByMeasurement.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/4935323 Cite]<br />
* '''Recent progress in open quantum systems: Non-Gaussian noise and decoherence in fermionic systems''' <br />
C. Neuenhahn, B. Kubala, B. Abel, and F. Marquardt, physica status solidi (b) 246, 1018 (2009) <br />
[http://www3.interscience.wiley.com/journal/122302278/abstract Journal] [[media:2008_09_Neuenhahn_UstronReview.pdf|PDF]] <br />
[http://www.citeulike.org/group/13403/article/7102335 Cite]<br />
* '''Quantum nondemolition photon detection in circuit QED and the quantum Zeno effect''' <br />
F. Helmer, M. Mariantoni, E. Solano, and F. Marquardt, Phys. Rev. A 79, 052115 (2009) <br />
[http://pra.aps.org/abstract/PRA/v79/i5/e052115 Journal] [[media:2009_05_Helmer_QuantumZenoPhotonDetection.pdf|PDF]] <br />
[http://www.citeulike.org/group/13403/article/5086278 Cite]<br />
* '''Cavity grid for scalable quantum computation with superconducting circuits''' <br />
F. Helmer, M. Mariantoni, A. G. Fowler, J. v. Delft, E. Solano, and F. Marquardt, EPL 85, 50007 (2009) <br />
[http://iopscience.iop.org/0295-5075/85/5/50007 Journal] [[media:2009_03_CavityGrid_EPL.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/5913639 Cite]<br />
* '''Universal Dephasing in a Chiral 1D Interacting Fermion System''' <br />
Clemens Neuenhahn and Florian Marquardt, Physical Review Letters 102, 046806 (2009) <br />
[http://prl.aps.org/abstract/PRL/v102/i4/e046806 Journal] [[media:2009_01_NeuenhahnMarquardt_UniversalDephasing.pdf|PDF]] <br />
[http://www.citeulike.org/group/13403/article/7102359 Cite]<br />
<br />
== 2008 ==<br />
<br />
* '''Dephasing by electron-electron interactions in a ballistic Mach-Zehnder interferometer''' <br />
C. Neuenhahn and F. Marquardt, New Journal of Physics 10, 115018 (2008) [http://iopscience.iop.org/1367-2630/10/11/115018 Journal] [[media:2008_11_MachZehnderNJP.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3656178 Cite]<br />
<br />
* '''Introduction to dissipation and decoherence in quantum systems''' <br />
F. Marquardt and A. Püttmann, arXiv:0809.4403 [http://arxiv.org/abs/0809.4403 Journal] [[media:2008_09_MarquardtPuettmann_LectureNotesDecoherence.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3339389 Cite]<br />
<br />
* '''Optomechanics: Push towards the quantum limit''' (News&Views)<br />
F. Marquardt, Nature Physics 4, 513 (2008) [http://www.nature.com/nphys/journal/v4/n7/full/nphys1006.html Journal] (no PDF) [http://www.citeulike.org/group/13403/article/3003144 Cite]<br />
<br />
* '''Dispersive optomechanics: a membrane inside a cavity''' <br />
A. M. Jayich, J. C. Sankey, B. M. Zwickl, C. Yang, J. D. Thompson, S. M. Girvin, A. A. Clerk, F. Marquardt, and J. G. E. Harris, New Journal of Physics 10, 095008 (2008) [http://iopscience.iop.org/1367-2630/10/9/095008 Journal] [[media:2008_05_Jayich_MIM_NJP.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3363153 Cite]<br />
<br />
* '''Decoherence by Quantum Telegraph Noise: A numerical evaluation''' <br />
B. Abel and F. Marquardt, Phys. Rev. B 78, 201302 (R) (2008) [http://prb.aps.org/abstract/PRB/v78/i20/e201302 Journal] [[media:2008_11_AbelMarquardt_QuantumTelegraphNoise.pdf|PDF]] [http://www.citeulike.org/group/13403/article/7102147 Cite]<br />
<br />
* '''The optomechanical instability in the quantum regime''' <br />
M. Ludwig, B. Kubala, and F. Marquardt, New Journal of Physics 10, 095013 (2008) [http://iopscience.iop.org/1367-2630/10/9/095013 Journal] [[media:2008_03_Ludwig_OptomechInstabQuantum.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3363158 Cite]<br />
<br />
* '''Quantum theory of optomechanical cooling''' <br />
F. Marquardt, A. A. Clerk, and S. M. Girvin, Journal of Modern Optics 55, 3329 (2008) [http://www.informaworld.com/smpp/content~content=a906574424&db=all Journal] [[media:2008_03_MarquardtClerkGirvin_ReviewOptoCooling.pdf|PDF]] [http://www.citeulike.org/group/13403/article/7118185 Cite]<br />
<br />
* '''Back-action evasion and squeezing of a mechanical resonator using a cavity detector''' <br />
A. A. Clerk, F. Marquardt, and K. Jacobs, New Journal of Physics 10, 095010 (2008) [http://iopscience.iop.org/1367-2630/10/9/095010 Journal] [[media:2008_02_Clerk_SqueezingByFeedback.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3002134 Cite]<br />
<br />
* '''Self-Induced Oscillations in an Optomechanical System driven by Bolometric Backaction''' <br />
Constanze Metzger, Max Ludwig, Clemens Neuenhahn, Alexander Ortlieb, Ivan Favero, Khaled Karrai, and Florian Marquardt, Phys. Rev. Lett. 101, 133903 (2008) [http://prl.aps.org/abstract/PRL/v101/i13/e133903 Journal] [[media:2008_Metzger_SelfInducedOscillationsInAnOptomechanicalSystemDrivenByBolometricBackaction.pdf|PDF]] [http://www.citeulike.org/group/13403/article/4633438 Cite]<br />
<br />
* '''Mesoscopic Spin-Boson Models of Trapped Ions''' <br />
D. Porras, F. Marquardt, J. von Delft, and J.I. Cirac, Phys. Rev. A (R) 78, 010101 (2008) [http://pra.aps.org/abstract/PRA/v78/i1/e010101 Journal] [[media:2008_Porras_SpinBosonIons.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3033056 Cite]<br />
<br />
* '''Strong dispersive coupling of a high finesse cavity to a micromechanical membrane''' <br />
J. D. Thompson, B. M. Zwickl, A. M. Jayich, F. Marquardt, S. M. Girvin, and J. G. E. Harris, Nature 452, 72 (2008) [http://www.nature.com/nature/journal/v452/n7183/abs/nature06715.html Journal] [[media:2008_Thompson_StrongDispersiveCoupling.pdf|PDF]] [http://www.citeulike.org/group/13403/article/2476650 Cite]<br />
<br />
* '''Measuring the size of a quantum superposition of two many-body states''' <br />
F. Marquardt, B. Abel, and J. v. Delft, Phys. Rev. A 78, 012109 (2008) [http://pra.aps.org/abstract/PRA/v78/i1/e012109 Journal] [[media:2008_MarquardtAbelDelft_CatSize.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3033003 Cite]<br />
<br />
== 2007 ==<br />
<br />
* '''Quantum Theory of cavity-assisted sideband cooling of mechanical motion''' <br />
F. Marquardt, J. P. Chen, A. A. Clerk, and S. M. Girvin, Phys. Rev. Lett. 99, 093902 (2007) [http://prl.aps.org/abstract/PRL/v99/i9/e093902 Journal] [[media:1999_Marquardt_Sideband.pdf|PDF]] [http://www.citeulike.org/article/3972865 Cite]<br />
<br />
* '''Coherence oscillations in dephasing by non-Gaussian shot noise''' <br />
I. Neder and F. Marquardt, New Journal of Physics 9, 112 (2007) <br />
[http://dx.doi.org/10.1088/1367-2630/9/5/112 Journal]<br />
[[media:2007_05_NederMarquardt_NJP_NonGaussianNoise.pdf|PDF]]<br />
[http://www.citeulike.org/user/FlorianMarquardt/article/1287268 Cite]<br />
<br />
* '''Controlled Dephasing of Electrons by Non-Gaussian Shot Noise''' <br />
I. Neder, F. Marquardt, M. Heiblum, D. Mahalu, and V. Umansky, Nature Physics 3, 534 (2007) <br />
[http://www.nature.com/nphys/journal/v3/n8/full/nphys627.html Journal]<br />
[[media:2007_NederMarquardt_NaturePhysics_NonGaussianNoise.pdf|PDF]]<br />
[http://www.citeulike.org/user/FlorianMarquardt/article/3014989 Cite]<br />
<br />
* '''Self-consistent calculation of the electron distribution near a Quantum-Point Contact in the integer Quantum Hall Effect''' <br />
A. Siddiki and F. Marquardt, Phys. Rev. B 75, 045325 (2007) <br />
[http://dx.doi.org/10.1103/PhysRevB.75.045325 Journal] <br />
[[media:2007_01_Siddiki_QPCselfconsistent.pdf|PDF]]<br />
[http://www.citeulike.org/user/FlorianMarquardt/article/3014993 Cite]<br />
<br />
* '''Efficient on-chip source of microwave photon pairs in superconducting circuit QED''' <br />
F. Marquardt, Phys. Rev. B 76, 205416 (2007) <br />
[http://dx.doi.org/10.1103/PhysRevB.76.205416 Journal] <br />
[[media:2007_Marquardt_PDC_CircuitQED.pdf|PDF]]<br />
[http://www.citeulike.org/user/FlorianMarquardt/article/3015003 Cite]<br />
<br />
* '''Decoherence in weak localization I: Pauli principle in influence functional''' <br />
F. Marquardt, J. v. Delft, R. Smith, and V. Ambegaokar, Phys. Rev. B 76, 195331 (2007)<br />
[http://dx.doi.org/10.1103/PhysRevB.76.195331 Journal]<br />
[[media:2007_11_Marquardt_WeakLocOne.pdf|PDF]]<br />
[http://www.citeulike.org/user/FlorianMarquardt/article/3015010 Cite]<br />
<br />
* '''Decoherence in weak localization II: Bethe-Salpeter calculation of Cooperon''' <br />
J. v. Delft, F. Marquardt, R. Smith, and V. Ambegaokar, Phys. Rev. B 76, 195332 (2007) <br />
[http://prb.aps.org/abstract/PRB/v76/i19/e195332 Journal]<br />
[[media:2007_11_vonDelft_WeakLocTwo.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/3015012 Cite]<br />
<br />
== 2006 ==<br />
<br />
* '''Equations of motion approach to decoherence and current noise in ballistic interferometers coupled to a quantum bath''' <br />
F. Marquardt, Phys. Rev. B 74, 125319 (2006) <br />
[http://prb.aps.org/abstract/PRB/v74/i12/e125319 Journal]<br />
[[media:2006_09_MZQB_Long.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/3015006 Cite]<br />
<br />
* '''Decoherence of fermions subject to a quantum bath''' <br />
F. Marquardt, in Advances in Solid State Physics (Springer), Vol. 46, ed. R. Haug [cond-mat/0604626] <br />
[http://www.springerlink.com/content/30368wghpg662515/ Journal]<br />
[[media:2006_04_MarquardtReviewDecoherence.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/7128692 Cite]<br />
<br />
* '''Correlation induced resonances in transport through coupled quantum dots''' <br />
V. Meden and F. Marquardt, Phys. Rev. Lett. 96, 146801 (2006)<br />
[http://prl.aps.org/abstract/PRL/v96/i14/e146801 Journal]<br />
[[media:2006_04_MedenMarquardt_PRL.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/3015007 Cite]<br />
<br />
* '''Dynamical multistability induced by radiation pressure in high-finesse micromechanical optical cavities''' <br />
F. Marquardt, J. G. E. Harris, and S. M. Girvin, Phys. Rev. Lett. 96, 103901 (2006) <br />
[http://prl.aps.org/abstract/PRL/v96/i10/e103901 Journal]<br />
[[media:2006_03_MarquardtHarrisGirvin_CantileverPRL.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/3015016 Cite]<br />
<br />
== 2005 ==<br />
<br />
* '''Fermionic Mach-Zehnder interferometer subject to a quantum bath''' <br />
F. Marquardt, Europhysics Letters 72, 788 (2005) <br />
[http://iopscience.iop.org/0295-5075/72/5/788/ Journal]<br />
[[media:2005_12_MZQB_EPL.pdf|PDF]]<br />
[http://www.citeulike.org/group/13403/article/3015029 Cite]<br />
* '''A many-fermion generalization of the Caldeira-Leggett model''' <br />
F. Marquardt and D. S. Golubev, Phys. Rev. A 72, 022113 (2005) [http://pra.aps.org/abstract/PRA/v72/i2/e022113 Journal] [[media:2005_MarquardtGolubev_ManyFermionCaldeiraLeggett.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015031 Cite]<br />
* '''Spin Relaxation in a Quantum Dot due to Nyquist Noise''' <br />
F. Marquardt and V. A. Abalmassov, Phys. Rev. B 71, 165325 (2005) [http://prb.aps.org/abstract/PRB/v71/i16/e165325 Journal] [[media:2004_04_Abalmassov_NyquistNoiseSpinRelax.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015034 Cite]<br />
<br />
== Before arrival as group leader in Munich (2001-2004) ==<br />
<br />
== 2004 ==<br />
<br />
* '''Perturbative corrections to the Gutzwiller mean-field solution of the Mott-Hubbard model'''<br />
C. Schroll, F. Marquardt, and C. Bruder, Phys. Rev. A 70, 053609 (2004) [http://pra.aps.org/abstract/PRA/v70/i5/e053609 Journal] [[media:2004_04_SchrollEtAl_BeyondGutzwiller.pdf|PDF]] [http://www.citeulike.org/user/OliverViehmann/article/3015067 Cite]<br />
* '''Effects of dephasing on shot noise in an electronic Mach-Zehnder interferometer''' <br />
F. Marquardt and C. Bruder, Phys. Rev. B 70, 125305 (2004) [http://prb.aps.org/abstract/PRB/v70/i12/e125305 Journal] [[media:2004_09_MZ_PRB.pdf|PDF]] [http://www.citeulike.org/user/OliverViehmann/article/3015036 Cite]<br />
* '''Relaxation and Dephasing in a Many-Fermion Generalization of the Caldeira-Leggett Model''' <br />
F. Marquardt and D. S. Golubev, Phys. Rev. Lett. 93, 130404 (2004) [http://prl.aps.org/abstract/PRL/v93/i13/e130404 Journal] [[media:2004_09_ManyFermionCL_PRL.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015042 Cite]<br />
* '''Influence of dephasing on shot noise in an electronic Mach-Zehnder interferometer''' <br />
F. Marquardt and C. Bruder, Phys. Rev. Lett. 92, 056805 (2004) [http://prl.aps.org/abstract/PRL/v92/i5/e056805 Journal] [[media:2004_02_MZ_PRL.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015041 Cite]<br />
* '''Electron-nuclei spin relaxation through phonon-assisted hyperfine interaction in a quantum dot''' <br />
V. A. Abalmassov and F. Marquardt, Phys. Rev. B 70, 075313 (2004) [http://prb.aps.org/abstract/PRB/v70/i7/e075313 Journal] [[media:2004_08_V.A.Abalmassov and F. Marquardt_PhononAssistedHyperfineRelaxation.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015037 Cite]<br />
<br />
== 2003 ==<br />
<br />
* '''Dephasing in sequential tunneling through a double-dot interferometer''' <br />
F. Marquardt and C. Bruder, Phys. Rev. B 68, 195305 (2003) [http://prb.aps.org/abstract/PRB/v68/i19/e195305 Journal] [[media:2003_11_DoubleDotInterferometer_SequentialTunneling.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015044 Cite]<br />
<br />
== 2002 ==<br />
<br />
* '''Non-Markoffian effects of a simple nonlinear bath''' <br />
H. Gassmann, F. Marquardt, and C. Bruder, Phys. Rev. E 66, 041111 (2002) [http://pre.aps.org/abstract/PRE/v66/i4/e041111 Journal] [[media:2002_10_GassmannEtAl_NonMarkoffianEffectsNonlinearBath.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015108 Cite]<br />
* '''Separation quality of a geometric ratchet''' <br />
C. Keller, F. Marquardt, and C. Bruder, Phys. Rev. E 65, 041927 (2002) [http://pre.aps.org/abstract/PRE/v65/i4/e041927 Journal] [[media:2002_04_KellerEtAl_RatchetSeparationQuality.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015050 Cite]<br />
* '''Visibility of the Aharonov-Bohm effect in a ring coupled to a fluctuating magnetic flux''' <br />
F. Marquardt and C. Bruder, Journal of Low Temperature Physics 126, 1325-1337 (2002) [http://www.springerlink.com/content/w367082526806275/?p=3c0e1139d17c429ead079955f0c321d7&pi=34 Journal]<br />
[[media:2002_02_F.Marquardt_VisibilityABeffectRingFluctuatingMagneticFlux.pdf|PDF]] [http://www.citeulike.org/group/13403/article/7112809 Cite]<br />
* '''Aharonov-Bohm ring with fluctuating flux''' <br />
F. Marquardt and C. Bruder, Phys. Rev. B 65, 125315 (2002) [http://prb.aps.org/abstract/PRB/v65/i12/e125315 Journal] [[media:2002_03_F.Marquardt_ABring.pdf|PDF]] [http://www.citeulike.org/group/13403/article/3015052 Cite]<br />
<br />
== 2001 ==<br />
<br />
* '''Superposition of two mesoscopically distinct quantum states: Coupling a Cooper-pair box to a large superconducting island''' <br />
F. Marquardt and C. Bruder, Phys. Rev. B 63, 054514 (2001) [http://prb.aps.org/abstract/PRB/v63/i5/e054514 Journal] [[media:2001_01_CooperPairBox.pdf|PDF]] [http://www.citeulike.org/user/FlorianMarquardt/article/3014241 Cite]</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=File:Arxiv.1703.08191.pdf&diff=5138File:Arxiv.1703.08191.pdf2017-03-30T11:25:23Z<p>ThomasFoesel: </p>
<hr />
<div>"L lines, C points and Chern numbers: understanding band structure topology using polarization fields"<br />
Thomas Fösel, Vittorio Peano, Florian Marquardt<br />
Comments: 16 pages, 5 figures; presented at the APS March Meeting 2017 (session A15)</div>ThomasFoeselhttp://theorie2.physik.uni-erlangen.de/index.php?title=Seminars&diff=4353Seminars2016-02-08T15:58:30Z<p>ThomasFoesel: </p>
<hr />
<div>== External seminars ==<br />
<br />
* [http://www.physik.uni-erlangen.de/aktuelles/physikalisches-kolloquium.shtml The physics "Kolloquium" on Mondays at 16:15]<br />
* [[Kolloquium der Theoretischen_Physik]] on Tuesdays, 16-18h in lecture theatre F<br />
* [http://www.thcp.physik.uni-erlangen.de/Seminars/index.htm Seminars on Theoretical Chemical Physics (group of Prof. Thoss), on Thursdays, 16-18h]<br />
* [http://mpl.mpg.de/mpf/php/index.php/en/Seminare/Diese_Woche/ This week's seminars at the Max Planck Institute for the Science of Light]<br />
<br />
== Seminar on nanophysics and quantum optics ==<br />
<br />
=== Winter term 2015/16 ===<br />
'''''Tuesdays, starting at 14:00, Seminar room 01.332''''' <br />
* 20.10. <span style="color:darkgreen;"> Silvia Viola Kusminskiy</span>: "Current induced forces in nano-electromechanical systems"<br />
* 03.11. <span style="color:darkgreen;"> Carlos Navarrete-Benlloch</span>: "Degenerate parametric oscillation in membrane quantum optomechanics"<br />
* 10.11. <span style="color:darkgreen;">Stefan Walter</span>: "Dynamical Gauge Fields in Optomechanics"<br />
* 17.11. <span style="color:darkgreen;">Manuel Pfeuffer</span>: "Nonlinear stochastic pattern formation in optomechanical arrays"<br />
* 24.11. <span style="color:darkgreen;">Jose Alonso</span>: "Thermodynamics of weakly measured quantum systems"<br />
* 01.12. <span style="color:darkgreen;">Roland Lauter</span>: " Stochastic dynamics in optomechanical oscillator arrays"<br />
* 08.12. <span style="color:darkgreen;">Florian Endres</span>: "Quantum stochastic heat engines"<br />
* 15.12. <span style="color:darkgreen;">Christian Brendel</span>: "Introduction to COMSOL" <br />
* 12.01. <span style="color:darkgreen;">Sadegh Raeisi</span><br />
* 19.01. <span style="color:darkgreen;">Chau Nguyen ''(MPIKS Dresden)''</span>:"The pure-state outcomes of quantum steering"<br />
* 26.01. <span style="color:darkgreen;">Jan Korbel ''(Technical University Prague)''</span>: "Generalized entropies: what are they good for?"<br />
* 02.02. <span style="color:darkgreen;">Kevin Jaksch</span><br />
* 08.02. <span style="color:darkgreen;">Thomas Fösel</span>: "Topological States in a Mechanical System"<br />
* Koppany Kormoczi<br />
* EL-Bachelor 1<br />
* EL-Bachelor 2<br />
* EL-Bachelor 3<br />
* Vittorio Peano<br />
<br />
=== Summer term 2015 ===<br />
'''''Tuesdays, starting at 14:15, Seminar room 02.779 (Staudtstr. 7, building B3, Erlangen)'''''<br />
* 11.05.: Andreas Dechan "Generalized Green-Kubo relation and application to scale-invariant dynamics"<br />
* 18.05.: Sadegh Raeisi<br />
* 02.06.: Michael’s practice talk<br />
* 09.06.: tba<br />
* 16.06.: Stefan Walter "Introduction to Lattice Gauge Theory"<br />
* 23.06.: Alexander Friedenberger<br />
* 30.06.: Thales Roque<br />
<br />
=== Archive===<br />
Earlier seminars: '''[[Archive]]'''<br />
<br />
== Optomechanics seminar ==<br />
'''''Wednesdays, starting at 11:00, seminar room SR 02.729'''''<br />
=== Winter term 2015/2016 ===<br />
* 14.10. Taofiq Paraiso<br />
* 21.10. Roland Lauter<br />
* 28.10. Koppany Kormoczi<br />
* 04.11. Sadegh Raeisi<br />
* 11.11. Talitha Weiß<br />
* 18.11. Stefan Walter<br />
* 25.11. Vittorio Peano<br />
* 02.12. Hannes Pfeifer<br />
* 09.12. Silvia Viola Kusminskiy<br />
* 16.12. Christian Brendel<br />
* 13.01. Roland Lauter<br />
* 20.01. Koppany Kormoczi<br />
* 27.01. Sadegh Raeisi<br />
* 03.02. Vittorio Peano<br />
* '''10.02. <span style="color:darkgreen;">Talitha Weiß</span>'''<br />
* 17.02. Taofiq Paraiso<br />
* 24.02. Stefan Walter<br />
* 02.03. Hannes Pfeifer<br />
* 23.03. Christian Brendel <br />
* 30.03. Silvia Viola Kusminskiy<br />
<br />
=== Summer term 2015 ===<br />
* 15.04. Hannes Pfeifer<br />
* 22.04. Vittorio Peano<br />
* 29.04. Stefan Walter<br />
* 06.05. Sadegh Raeisi<br />
* 13.05. Thales Figueiredo Roque<br />
* 20.05.<br />
* 27.05. Taofiq Paraiso<br />
* 03.06. Christian Brendel<br />
* 10.06. Talitha Weiß<br />
* 17.06. Roland Lauter<br />
* 24.06. Leyun Zang<br />
* 01.07. Hannes Pfeifer<br />
* 08.07. <br />
* 15.07. Stefan Walter<br />
* 22.07. Vittorio Peano<br />
* 29.07. Sadegh Raeisi<br />
* 05.08. Christian Brendel<br />
* 12.08. Taofiq Paraiso<br />
* 19.08. Roland Lauter<br />
* 02.09. Talitha Weiß<br />
* 09.09. Thales Figueiredo Roque<br />
* 16.09. Stefan Walter<br />
* 23.09. Hannes Pfeifer<br />
* 30.09. Vittorio Peano<br />
* 07.10. Christian Brendel<br />
<br />
=== Archive===<br />
Earlier seminars: '''[[Optomechanics Archive]]'''</div>ThomasFoesel