Merge pull request #19 from darkblue-b/sci_upd

scitools courses refresh 01jul17
intro16
Brian M Hamlin 2017-07-01 15:07:19 -07:00 committed by GitHub
commit 155abc53c2
22 changed files with 2764 additions and 2993 deletions

SciTools-courses/.gitignore vendored Normal file

@@ -0,0 +1,5 @@
course_content/simple.svg
.ipynb_checkpoints
.project
.pydevproject


@@ -0,0 +1,37 @@
# The language in this case has no bearing - we are going to be making use of conda for a
# python distribution for the scientific python stack.
language: python
sudo: false
env:
global:
- CONDA_INSTALL_LOCN="${HOME}/conda"
# Generated with `$ travis encrypt GH_TOKEN=<auth_token>` for SciTools/courses.
- secure: "IGrdEKM5CH5OCus7pCCrzSWV9WOEuY9QKvWFbNBkBLa3MNszvZop3oioB3QVvygHvMV2HvcOBEx4KaQvUt6UanNb8j+C1zrPGWBuDeuWVDC3pCfqCsXRGXJU1TJp0alRByxaSoenvSjEBJi/GpF9zsAosS/wJIX1rBCbLsjKrhU="
matrix:
- PYTHON=3.4
install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ${HOME}/miniconda.sh
- bash ${HOME}/miniconda.sh -b -p ${CONDA_INSTALL_LOCN} && export PATH=${CONDA_INSTALL_LOCN}/bin:$PATH
- git config --global user.name "Travis user"
- git config --global user.email travis-tests@example.com
# Now do the things we need to do to install it.
- conda install notebook nbconvert python=${PYTHON} --yes --quiet
- rm -rf build
- git clone https://dkillick:${GH_TOKEN}@github.com/SciTools/courses.git --branch build build
script:
- ./make.sh
- cd build
- git add -A --ignore-removal .
- git status
# This will only fail if there's nothing to commit, so we can safely skip over this failure.
- git commit -am "Build of ${TRAVIS_COMMIT}" || true
- if [ ! -z "$GH_TOKEN" ]; then
git push origin build > /dev/null;
fi


@@ -1,5 +1,16 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"%matplotlib inline"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -20,7 +31,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
"collapsed": false
},
"outputs": [],
"source": [
@@ -48,6 +59,13 @@
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A full list of Cartopy projections is available at http://scitools.org.uk/cartopy/docs/latest/crs/projections.html."
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -137,9 +155,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"----\n",
"\n",
"**Exercise 1:**\n",
"### Exercise 1\n",
"\n",
"The following snippet of code produces coordinate arrays and some data in a rotated pole coordinate system. The coordinate system for the `x` and `y` values, which is similar to that found in some limited area models of Europe, has a projection \"north pole\" at 177.5 longitude and 37.5 latitude."
]
@@ -148,7 +164,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
"collapsed": false
},
"outputs": [],
"source": [
@@ -167,14 +183,16 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"1\\. Define a cartopy coordinate reference system which represents a rotated pole with a pole latitude of 37.5 and a pole longitude of 177.5."
"**Part 1**\n",
"\n",
"Define a cartopy coordinate reference system which represents a rotated pole with a pole latitude of 37.5 and a pole longitude of 177.5."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
"collapsed": false
},
"outputs": [],
"source": []
@@ -183,14 +201,16 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Produce a map, with coastlines, using the coordinate reference system created in #1."
"**Part 2**\n",
"\n",
"Produce a map, with coastlines, using the coordinate reference system created in Part 1."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
"collapsed": false
},
"outputs": [],
"source": []
@@ -199,21 +219,22 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"3\\. Produce a map, with coastlines, in a Plate Carree projection with a pcolormesh of the data generated by the code snippet provided at the beginning of the example. Remember that the data is supplied in the rotated coordinate system defined in #1."
"**Part 3**\n",
"\n",
"Produce a map, with coastlines, in a Plate Carree projection with a pcolormesh of the data generated by the code snippet provided at the beginning of the exercise. Remember that the data is supplied in the rotated coordinate system defined in Part 1."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
"collapsed": false
},
"outputs": [],
"source": []
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python 2",
"language": "python",
@@ -229,20 +250,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.11+"
},
"latex_envs": {
"bibliofile": "biblio.bib",
"cite_by": "apalike",
"current_citInitial": 1,
"eqLabelWithNumbers": true,
"eqNumInitial": 0
},
"widgets": {
"state": {},
"version": "0.3.0"
"version": "2.7.12"
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 0
}


@@ -1,5 +1,16 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"%matplotlib inline"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -456,7 +467,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 1:\n",
"### Exercise 1\n",
"\n",
"1\\. Using the file in ``iris.sample_data_path('atlantic_profiles.nc')`` load the data and print the cube list. Store these cubes in a variable called cubes."
]
@@ -582,14 +593,29 @@
"source": [
"#### Note on sample_data_path:\n",
"\n",
"Throughout this course we will make use of the sample data that Iris provides. The function ``iris.sample_data_path`` returns the appropriate path to the file in the Iris sample data collection. A common mistake for Iris users is to use the ``sample_data_path`` function to access data that is not part of Iris's sample data collection - this is bad practice and is unlikely to work in the future.\n",
"Throughout this course we will make use of the sample data that Iris provides. The function ``iris.sample_data_path`` returns the appropriate path to the file in the Iris sample data collection. A common mistake for Iris users is to use the ``sample_data_path`` function to access data that is not part of Iris's sample data collection - this is bad practice and is unlikely to work in the future."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Exercise 2\n",
"\n",
"**Exercise 2:**\n",
"Print the result of ``iris.sample_data_path('uk_hires.pp')`` to verify that it returns a string pointing to a file on your system. Use this string directly in the call to ``iris.load`` and confirm the result is the same as in the previous example, e.g.:\n",
"\n",
" print iris.load('/path/to/iris/sampledata/uk_hires.pp', 'air_potential_temperature')\n"
" print iris.load('/path/to/iris/sampledata/uk_hires.pp', 'air_potential_temperature')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
@@ -631,9 +657,16 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The load functions all accept a list of filenames to load, and any of the filenames can be \"glob\" patterns (http://docs.python.org/2/library/glob.html).\n",
"The load functions all accept a list of filenames to load, and any of the filenames can be \"glob\" patterns (http://docs.python.org/2/library/glob.html)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Exercise 2 (continued)\n",
"\n",
"**Exercise 2 continued:** Read in the files found at **``iris.sample_data_path('GloSea4', 'ensemble_010.pp')``** and\n",
"Read in the files found at **``iris.sample_data_path('GloSea4', 'ensemble_010.pp')``** and\n",
"**``iris.sample_data_path('GloSea4', 'ensemble_011.pp')``** using a single load call. Do this by:\n",
"\n",
"1\\. providing a list of the two filenames."
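The "glob" approach can be sketched with Python's standard ``glob`` module; the directory name below is a hypothetical stand-in for the path returned by ``iris.sample_data_path``:

```python
import glob

# A single "glob" pattern matching both ensemble files; the character
# class [01] matches either '0' or '1' in the final position.
filenames = sorted(glob.glob('GloSea4/ensemble_01[01].pp'))
```

The resulting list of filenames can then be passed to a single load call.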
@@ -821,6 +854,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Time Constraints\n",
"\n",
"It is common to want to build a constraint for time. With Iris < v1.6 it was harder to build time constraints than we would have liked, because of the way that time coordinates had been implemented.\n",
"\n",
"However, since v1.6 this has been made simpler through the ability to compare against cells containing datetimes. The functionality can be enabled globally within the session (and will be enabled by default in a future release of Iris) with:"
@@ -841,7 +876,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"With this set, it is now possible to do the same constraint by simply:"
"We can now make time constraints as follows:"
]
},
{
@@ -861,7 +896,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"There are currently still some limitations however. For example, it is not yet possible to do cell based datetime comparisons when the datetime is from anything other than a Gregorian calendar (e.g. such as the 360-day calendar often used in climate models). When this is the case however, we can always access individual components of the datetime and do comparisons on those:"
"There are currently still some limitations. For example, it is not yet possible to do cell-based datetime comparisons when the datetime is from anything other than a Gregorian calendar (such as the 360-day calendar often used in climate models). When this is the case, however, we can always access individual components of the datetime and do comparisons on those:"
]
},
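The component-based approach can be sketched in plain Python, without Iris; the 2014 dates below are invented purely for illustration:

```python
import datetime

# Build some example time points, then keep only those whose month
# component falls in June-August - the same test a constraint on the
# time cell's .point.month would apply.
times = [datetime.datetime(2014, month, 1) for month in range(1, 13)]
summer = [t for t in times if t.month in (6, 7, 8)]
```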
{
@@ -901,14 +936,14 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 3:**\n",
"### Exercise 3\n",
"\n",
"1\\. The following function tells us whether or not a cube has cell methods:\n",
"The following function tells us whether or not a cube has cell methods:\n",
"\n",
" def has_cell_methods(cube):\n",
" return len(cube.cell_methods) > 0\n",
"\n",
"With the cubes loaded from ``[iris.sample_data_path('A1B_north_america.nc'), iris.sample_data_path('uk_hires.pp')]`` use the CubeList's **``extract``** method to filter only the cubes that have cell methods. (Hint: Look at the ``iris.Constraint`` documentation for the **cube_func** keyword). You should find that the 3 cubes are whittled down to just 1."
"1\\. With the cubes loaded from ``[iris.sample_data_path('A1B_north_america.nc'), iris.sample_data_path('uk_hires.pp')]`` use the CubeList's **``extract``** method to filter only the cubes that have cell methods. (Hint: Look at the ``iris.Constraint`` documentation for the **cube_func** keyword). You should find that the 3 cubes are whittled down to just 1."
]
},
{
@@ -1228,7 +1263,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**1\\.** Identify and resolve the issue preventing ``resources/merge_exercise.1.*.nc`` from merging.\n",
"1\\. Identify and resolve the issue preventing ``resources/merge_exercise.1.*.nc`` from merging.\n",
"\n",
" >>> raw_cubes = iris.load_raw('../resources/merge_exercise.1.*.nc')\n",
" >>> raw_cubes.merge_cube()\n",
@@ -1263,7 +1298,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**2\\.** Identify and resolve the issue preventing ``resources/merge_exercise.2.*.nc`` from merging.\n",
"2\\. Identify and resolve the issue preventing ``resources/merge_exercise.2.*.nc`` from merging.\n",
"\n",
" >>> raw_cubes = iris.load_raw('../resources/merge_exercise.2.*.nc')\n",
" >>> raw_cubes.merge_cube()\n",
@@ -1285,7 +1320,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**3\\.** (extension) Identify and resolve the issue preventing ``resources/merge_exercise.4.*.nc`` from merging.\n",
"3\\. (extension) Identify and resolve the issue preventing ``resources/merge_exercise.4.*.nc`` from merging.\n",
"\n",
" >>> raw_cubes = iris.load_raw('../resources/merge_exercise.4.*.nc')\n",
" >>> raw_cubes.merge_cube()\n",
@@ -1307,7 +1342,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**4\\.** (extension) Identify and resolve the issue preventing ``resources/merge_exercise.5.*.nc`` from merging (hint: Cubes can be indexed like NumPy arrays).\n",
"4\\. (extension) Identify and resolve the issue preventing ``resources/merge_exercise.5.*.nc`` from merging (hint: Cubes can be indexed like NumPy arrays).\n",
"\n",
" >>> raw_cubes = iris.load_raw('../resources/merge_exercise.5.*.nc')\n",
" >>> raw_cubes.merge_cube()\n",
@@ -1557,7 +1592,25 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 5:** What other aggregators are available? Calculate the potential temperature variance with time for the area averaged cube (hint: We want to reduce the vertical dimension, and end up with a cube of length 3). Print the data values of the resulting cube."
"### Exercise 5\n",
"\n",
"1\\. What other aggregators are available?"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Calculate the potential temperature variance with time for cube `area_avg` from above. Hint: We want to reduce the area averaged cube's vertical dimension, and end up with a cube of length 3. Print the data values of the resulting cube."
]
},
{
@@ -1682,7 +1735,6 @@
"\n",
"Iris comes with two plotting modules called ``iris.plot`` and ``iris.quickplot`` that wrap some of the common matplotlib plotting functions such that cubes can be passed as input rather than the usual NumPy arrays. The two modules are very similar, with the primary difference being that ``quickplot`` will add extra information to the axes, such as:\n",
"\n",
" * an appropriate colour map,\n",
" * a colorbar,\n",
" * labels for the x and y axes, and\n",
" * a title where possible."
@@ -1747,7 +1799,7 @@
"plt.subplot(1, 2, 2)\n",
"qplt.plot(ts)\n",
"\n",
"plt.subplots_adjust(hspace=0.5)\n",
"plt.subplots_adjust(wspace=0.5)\n",
"plt.show()"
]
},
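As a brief sketch of the adjustment above: ``wspace`` controls the horizontal gap between columns of subplots (``hspace`` the vertical gap between rows), so for a one-row, two-column layout ``wspace`` is the relevant parameter. The plotted values below are arbitrary, and the non-interactive 'agg' backend is used so no display is needed:

```python
import matplotlib
matplotlib.use('agg')  # headless backend; nothing pops up
import matplotlib.pyplot as plt

plt.subplot(1, 2, 1)
plt.plot([0, 1, 2])
plt.subplot(1, 2, 2)
plt.plot([2, 1, 0])

# wspace sets the gap between subplot columns, as a fraction of the
# average Axes width.
plt.subplots_adjust(wspace=0.5)
```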
@@ -1838,7 +1890,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 6:** Use the above cube with appropriate slicing, to produce the following:\n",
"### Exercise 6\n",
"\n",
"Use the above cube, with appropriate indexing, to produce the following:\n",
"\n",
"1\\. a **contour** plot of *time* vs *longitude*"
]
@@ -1956,7 +2010,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice that the resultant cube's name is now unknown and that the coordinates “time” and “forecast_period” have been removed; this is because these coordinates differed between the two input cubes."
"Notice that the resultant cube's name is now `unknown` and that its `attributes` and `cell methods` have disappeared; this is because these all differed between the two input cubes."
]
},
{
@@ -2022,7 +2076,7 @@
"source": [
"## Creating extra annotation coordinates for statistical convenience\n",
"\n",
"Sometimes we want to be able to categorise data before performing statistical operations on it. For example, \"daylight maximum\" and \"seasonal mean\" etc., with \"daylight\" and \"seasonal\" being categorised based, in this case, on the time coordinate.\n",
"Sometimes we want to be able to categorise data before performing statistical operations on it. For example, we might want to categorise our data by \"daylight maximum\" or \"seasonal mean\" etc. Both of these categorisations would be based on the time coordinate.\n",
"\n",
"The ``iris.coord_categorisation`` module provides convenience functions to add some common categorical coordinates, and provides a generalised function to allow the creation of custom categorisations."
]
@@ -2103,7 +2157,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Custom categorisation can be created with an arbitrary function. For example, the already existing ``add_year`` categorisor takes the 'time' coordinate, and creates a 'year' coordinate. This could be achieved without using the available ``add_year`` by:"
"### Custom categorisation\n",
"\n",
"Custom categorisation can be achieved with an arbitrary function. For example, the existing ``add_year`` categoriser takes the 'time' coordinate, and creates a 'year' coordinate. This could be achieved without using the available ``add_year`` by:"
]
},
{
@@ -2282,9 +2338,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 7\n",
"### Exercise 7\n",
"\n",
"1. Load 'A1B_north_america.nc' from the Iris sample data."
"Produce a set of plots that provide a comparison of decadal-mean air temperatures over North America:\n",
"\n",
"**Part 1**\n",
"\n",
"Load 'A1B_north_america.nc' from the Iris sample data."
]
},
{
@@ -2300,7 +2360,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Extract just data from the year 1980 and beyond from the loaded data."
"**Part 2**\n",
"\n",
"From the data loaded in Part 1, extract just the years from 1980 onwards."
]
},
{
@@ -2316,7 +2378,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"3\\. Define a function that takes a coordinate and a single time point as arguments, and returns the decade. For example, your function should return 2010 for the following:\n",
"**Part 3**\n",
"\n",
"Define a function that takes a coordinate and a single time point as arguments, and returns the decade. For example, your function should return 2010 for the following:\n",
"\n",
" time = iris.coords.DimCoord([10], 'time',\n",
" units='days since 2018-01-01')\n",
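The year-to-decade part of such a function can be sketched in plain Python; the helper name is invented, and the real exercise function would first convert the time point to a datetime via the coordinate's units:

```python
def decade(year):
    # Round a calendar year down to the first year of its decade,
    # e.g. 2018 -> 2010.
    return (year // 10) * 10
```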
@@ -2336,7 +2400,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"4\\. Add a \"decade\" coordinate to the loaded cube using your function and the coord categorisation module."
"**Part 4**\n",
"\n",
"Add a \"decade\" coordinate to the loaded cube using your function and the coord categorisation module."
]
},
{
@@ -2352,7 +2418,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"5\\. Calculate the decadal means cube for this scenario."
"**Part 5**\n",
"\n",
"Calculate the decadal means cube for this scenario."
]
},
{
@@ -2368,7 +2436,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"6\\. Create a figure with 3 rows and 4 columns displaying the decadal means, with the decade displayed prominently in each axes' title."
"**Part 6**\n",
"\n",
"Create a figure with 3 rows and 4 columns displaying the decadal means, with the decade displayed prominently in each axes' title."
]
},
{
@@ -2382,7 +2452,6 @@
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python 2",
"language": "python",
@@ -2398,20 +2467,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.11+"
},
"latex_envs": {
"bibliofile": "biblio.bib",
"cite_by": "apalike",
"current_citInitial": 1,
"eqLabelWithNumbers": true,
"eqNumInitial": 0
},
"widgets": {
"state": {},
"version": "0.3.0"
"version": "2.7.12"
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 0
}


@@ -1,5 +1,16 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"%matplotlib inline"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -26,11 +37,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## The matplotlib figure\n",
"## The matplotlib Figure\n",
"\n",
"At the heart of **every** matplotlib plot is the \"Figure\" object. The \"Figure\" object is the top level concept that can be drawn to one of the many output formats, or simply just to screen. Any object that can be drawn in this way is known as an \"Artist\" in matplotlib.\n",
"At the heart of every matplotlib plot is the \"Figure\". The Figure is the top level concept that can be drawn to one of the many output formats, or simply to screen.\n",
"\n",
"Lets create our first artist using pyplot, and then show it:"
"Let's create our first Figure using pyplot, and then show it:"
]
},
{
@@ -49,11 +60,12 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"On its own, drawing the figure artist is uninteresting and will result in an empty piece of paper (that's why we didn't see anything above).\n",
"On its own, drawing the Figure is uninteresting and will result in an empty piece of paper (that's why we didn't see anything above). \n",
"Other visible elements are added to a Figure to make a plot. All visible items in Matplotlib are instances of the [Artist](http://matplotlib.org/api/artist_api.html#artist-class) class: the Figure and Axes are both types of Artist.\n",
"\n",
"By far the most useful artist in matplotlib is the \"Axes\" artist. The Axes artist represents the \"data space\" of a typical plot. A rectangular axes (the most common axes, but not the only axes, e.g. polar plots) will have two (confusingly named) Axis Artists with tick labels and tick marks.\n",
"To start with we can draw an [Axes](http://matplotlib.org/api/axes_api.html) artist in the Figure, to represent our data space. The most basic Axes is rectangular and has tick labels and tick marks. Multiple Axes artists can be placed on a Figure.\n",
"\n",
"There is no limit on the number of Axes artists that can exist on a Figure artist. Let's go ahead and create a figure with a single Axes Artist, and show it using pyplot:"
"Let's go ahead and create a Figure with a single Axes, and show it using pyplot:"
]
},
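A minimal sketch of the Figure/Axes relationship described above, using the non-interactive 'agg' backend so it runs headless:

```python
import matplotlib
matplotlib.use('agg')  # non-interactive backend; no window appears
import matplotlib.pyplot as plt

fig = plt.figure()  # the top-level Figure
ax = plt.axes()     # a rectangular Axes placed on that Figure
```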
{
@@ -72,23 +84,30 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Matplotlib's ``pyplot`` module makes the process of creating graphics easier by allowing us to skip some of the tedious Artist construction. For example, we did not need to manually create the Figure artist with ``plt.figure`` because it was implicit that we needed a figure when we created the Axes artist.\n",
"Matplotlib's ``pyplot`` module makes the process of creating graphics easier by allowing us to skip some of the tedious object construction. For example, we did not need to manually create the Figure with ``plt.figure`` because it was implicit that we needed a Figure when we created the Axes.\n",
"\n",
"Under the hood matplotlib still had to create a Figure artist; we just didn't need to capture it into a variable. We can access the created object with the \"state\" functions found in pyplot called **``gcf``** and **``gca``**.\n",
"\n",
"**Exercise 1:**\n",
"\n",
" * Go to matplotlib.org and search for what these strangely named functions do.\n",
" You will find multiple results so remember we are looking for the ``pyplot`` versions of these functions."
"Under the hood matplotlib still had to create a Figure; we just didn't need to capture it into a variable. We can access the created object with the \"state\" functions found in pyplot called **``gcf``** and **``gca``**."
]
},
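A small sketch of the point above (again with the 'agg' backend): ``gca`` and ``gcf`` hand back the implicitly created objects:

```python
import matplotlib
matplotlib.use('agg')  # headless backend
import matplotlib.pyplot as plt

ax = plt.axes()                # implicitly creates a Figure too
assert plt.gca() is ax         # "get current axes"
assert plt.gcf() is ax.figure  # "get current figure"
```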
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Working with the axes\n",
"### Exercise 1\n",
"\n",
"Go to matplotlib.org and search for what these strangely named functions do.\n",
"\n",
"Hint: you will find multiple results, so remember we are looking for the ``pyplot`` versions of these functions."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Working with the Axes\n",
"\n",
"As has already been mentioned, most of your time building a graphic in matplotlib will be spent on the Axes. Whilst the matplotlib documentation for the Axes is very detailed, it is also rather difficult to navigate (though this is an area of ongoing improvement).\n",
"\n",
"As has already been mentioned, most of your time building a graphic in matplotlib will be spent on the Axes artist. Whilst the matplotlib documentation for the Axes artist is very detailed, it is also rather difficult to navigate (though this is an area of ongoing improvement).\n",
"As a result, it is often easier to find new plot types by looking at the pyplot module's documentation.\n",
"\n",
"The first and most common Axes method is ``plot``. Go ahead and look at the ``plot`` documentation from the following sources:\n",
@@ -117,14 +136,24 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice how the axes view limits (``ax.viewLim``) have been updated to include the whole of the line.\n",
"Should we want to add some spacing around the edges of our axes we could set the axes margin using the Axes artist's [``margins``](http://matplotlib.org/api/axes_api.html?highlight=axes#matplotlib.axes.Axes.margins) method. Alternatively, we could manually set the limits with the Axes artist's [``set_xlim``](http://matplotlib.org/api/axes_api.html?#matplotlib.axes.Axes.set_xlim) and [``set_ylim``](http://matplotlib.org/api/axes_api.html?#matplotlib.axes.Axes.set_ylim) methods.\n",
"Notice how the Axes view limits (``ax.viewLim``) have been updated to include the whole of the line.\n",
"Should we want to add some spacing around the edges of our Axes we can set a margin using the [``margins``](http://matplotlib.org/api/axes_api.html?highlight=axes#matplotlib.axes.Axes.margins) method. Alternatively, we can manually set the limits with the [``set_xlim``](http://matplotlib.org/api/axes_api.html?#matplotlib.axes.Axes.set_xlim) and [``set_ylim``](http://matplotlib.org/api/axes_api.html?#matplotlib.axes.Axes.set_ylim) methods."
]
},
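A brief sketch of the two approaches, with arbitrary data and the 'agg' backend:

```python
import matplotlib
matplotlib.use('agg')  # headless backend
import matplotlib.pyplot as plt

ax = plt.axes()
ax.plot([0, 1, 2], [0, 2, 1])

ax.margins(0.1)          # 10% whitespace around the data limits
ax.set_xlim(-0.5, 2.5)   # ...or set explicit view limits instead
ax.set_ylim(-0.5, 2.5)
```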
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Exercise 2\n",
"\n",
"------\n",
"\n",
"**Exercise 2:** Modify the previous example to produce three different figures that control the limits of the axes by:\n",
"\n",
"1\\. Manually setting the x and y limits to $[0.5, 2]$ and $[1, 5]$ respectively."
"Modify the previous example to produce three different Figures that control the limits of the Axes."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"1\\. Manually set the x and y limits to $[0.5, 2]$ and $[1, 5]$ respectively."
]
},
{
@@ -140,7 +169,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
" 2\\. Defining a margin such that there is 10% whitespace inside the axes around the drawn line (Hint: numbers to margins are normalised such that 0% is 0.0 and 100% is 1.0)."
"2\\. Define a margin such that there is 10% whitespace inside the Axes around the drawn line (Hint: numbers to margins are normalised such that 0% is 0.0 and 100% is 1.0)."
]
},
{
@@ -156,7 +185,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"3\\. Setting a 10% margin on the axes with the lower y limit set to 0. (Note: order is important here)"
"3\\. Set a 10% margin on the Axes with the lower y limit set to 0. (Note: order is important here)"
]
},
{
@@ -174,7 +203,7 @@
"source": [
"--------------\n",
"\n",
"In truth, the previous example can be simplified to be even shorter. First, we are not using the returned artists, so we could avoid the assignment and just call the appropriate functions. Second, in exactly the same way that we didn't *need* to manually create a Figure artist when using the ``pyplot.axes`` method, we can remove the ``plt.axes`` if we use the ``plot`` function from ``pyplot``. Our simple line example then becomes:"
"If we want to create a plot in its simplest form, without any modifications to the Figure or Axes, we can leave out the creation of artist variables. Our simple line example then becomes:"
]
},
{
@@ -193,16 +222,16 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The simplicity of this example shows how visualisations can be produced quickly and easily with matplotlib, but it is worth remembering that for full control of Figure and Axes artists we can mix the convenience of ``pyplot`` with the power of matplotlib's object oriented design."
"The simplicity of this example shows how visualisations can be produced quickly and easily with matplotlib, but it is worth remembering that for full control of the Figure and Axes artists we can mix the convenience of ``pyplot`` with the power of matplotlib's object oriented design."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"--------\n",
"### Exercise 3\n",
"\n",
"**Exercise 3:** By calling ``plot`` multiple times, create a single axes showing the line plots of $y=sin(x)$ and $y=cos(x)$ in the interval $[0, 2\\pi]$ with 200 linearly spaced $x$ samples."
"By calling ``plot`` multiple times, create a single Axes showing the line plots of $y=sin(x)$ and $y=cos(x)$ in the interval $[0, 2\\pi]$ with 200 linearly spaced $x$ samples."
]
},
{
@@ -220,9 +249,12 @@
"source": [
"--------\n",
"\n",
"## Multiple axes on the same figure (aka subplot)\n",
"## Multiple Axes on the same Figure (aka subplot)\n",
"\n",
"Matplotlib makes it relatively easy to add more than one Axes artist to a figure. The ``add_subplot`` method on a Figure artist, which is wrapped by the ``subplot`` function in ``pyplot``, adds an Axes artist in the grid position specified. To compute the position, we must tell matplotlib the number of rows and columns to separate the figure into, and which number the axes to be created is (1 based). For example, to create axes at the top right and bottom left of a $3 x 2$ notional grid of Axes artists the grid specifications would be ``2, 3, 3`` and ``2, 3, 4`` respectively:"
"Matplotlib makes it relatively easy to add more than one Axes object to a Figure. The ``Figure.add_subplot()`` method, which is wrapped by the ``pyplot.subplot()`` function, adds an Axes in the grid position specified. To compute the position, we tell matplotlib the number of rows and columns (respectively) to divide the figure into, followed by the index of the axes to be created (1 based). \n",
"\n",
"For example, to create an axes grid with two columns, the grid specification would be ``plt.subplot(1, 2, <plot_number>)``. \n",
"The left-hand Axes is plot number 1, created with ``subplot(1, 2, 1)``, and the right-hand one is number 2, ``subplot(1, 2, 2)``:"
]
},
{
@@ -233,9 +265,40 @@
},
"outputs": [],
"source": [
"top_right_ax = plt.subplot(2, 3, 3)\n",
"bottom_left_ax = plt.subplot(2, 3, 4)\n",
"ax_left = plt.subplot(1, 2, 1)\n",
"plt.plot([2,1,3,4])\n",
"plt.title('left = #1')\n",
"\n",
"ax_right = plt.subplot(1, 2, 2)\n",
"plt.plot([4,1,3,2])\n",
"plt.title('right = #2')\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Likewise, for plots above and below one another we would use *two* rows and *one* column, as in ``subplot(2, 1, <plot_number>)``."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's expand our grid to two rows and three columns, and place one set of Axes on the top right (grid specification ``(2, 3, 3)``) and another on the bottom left (grid specification ``(2, 3, 4)``):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"top_right_ax = plt.subplot(2, 3, 3, title='#3 = top-right')\n",
"bottom_left_ax = plt.subplot(2, 3, 4, title='#4 = bottom-left')\n",
"plt.show()"
]
},
@@ -245,7 +308,7 @@
"source": [
"--------\n",
"\n",
"**Exercise 3 continued:** Copy the answer from the previous task (plotting $y=sin(x)$ and $y=cos(x)$) and add the appropriate ``plt.subplot`` calls to create a figure with two rows of Axes artists, one showing $y=sin(x)$ and the other showing $y=cos(x)$."
"**Exercise 3 continued:** Copy the answer from the previous task (plotting $y=sin(x)$ and $y=cos(x)$) and add the appropriate ``plt.subplot`` calls to create a Figure with two rows of Axes, one showing $y=sin(x)$ and the other showing $y=cos(x)$."
]
},
{
@@ -371,7 +434,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Titles, Legends, colorbars and annotations"
"## Titles, legends, colorbars and annotations"
]
},
{
@@ -385,7 +448,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The ``suptitle`` pyplot function allows us to set the title of a figure, and the ``set_title`` method on an Axes artist allows us to set the title of an individual axes. Additionally Axes artists have methods named ``set_xlabel`` and ``set_ylabel`` to label the respective x and y Axis artists (that's Axis, not Axes). Finally, we can add text, located by data coordinates, with the ``text`` method on an Axes artist:"
"The ``suptitle`` pyplot function allows us to set the title of a Figure, and the ``set_title`` method on an Axes allows us to set the title of an individual Axes. Additionally, an Axes has methods named ``set_xlabel`` and ``set_ylabel`` to label the respective x and y axes. Finally, we can add text, located by data coordinates, with the Axes ``text`` method:"
]
},
{
@ -430,7 +493,7 @@
"source": [
"x = np.linspace(-3, 7, 200)\n",
"plt.plot(x, 0.5*x**3 - 3*x**2, linewidth=2,\n",
" label='$f(x)=0.5x^2-3x^2$')\n",
" label='$f(x)=0.5x^3-3x^2$')\n",
"plt.plot(x, 1.5*x**2 - 6*x, linewidth=2, linestyle='--',\n",
" label='Gradient of $f(x)$', )\n",
"plt.legend(loc='lower right')\n",
@ -496,7 +559,7 @@
"source": [
"## Savefig & backends\n",
"\n",
"Matplotlib allows you to specify a \"backend\" to drive rendering the figure. The backend includes the graphical user interface (GUI) library to use, and the most used backend (as it is normally the default one) is the \"TkAgg\" backend. When ``plt.show()`` is called, this backend pops up a figure in a new TkInter window, which is rendered by the anti-grain graphics library (also known as \"agg\"). Generally, the most common reason to want to change backends is for automated figure production on a headless server. In this situation, the \"agg\" backend can be used:\n",
"Matplotlib allows you to specify a \"backend\" to drive rendering the Figure. The backend includes the graphical user interface (GUI) library to use, and the most used backend (as it is normally the default one) is the \"TkAgg\" backend. When ``plt.show()`` is called, this backend pops up a Figure in a new TkInter window, which is rendered by the anti-grain graphics library (also known as \"agg\"). Generally, the most common reason to want to change backends is for automated Figure production on a headless server. In this situation, the \"agg\" backend can be used:\n",
"\n",
" import matplotlib\n",
" matplotlib.use('agg')\n",
@ -504,9 +567,9 @@
" \n",
"Note: The backend must be chosen before importing pyplot for the first time, unless the ``force`` keyword is added.\n",
"\n",
"Non-interactive backends such as the \"agg\" backend will do nothing when **``plt.show()``** is called - this is because there is nowhere (no graphical display) specified for a figure to be displayed.\n",
"Non-interactive backends such as the \"agg\" backend will do nothing when **``plt.show()``** is called - this is because there is nowhere (no graphical display) specified for a Figure to be displayed.\n",
"\n",
"To save a figure programmatically the ``savefig`` function can be used from _any_ backend:"
"To save a Figure programmatically the ``savefig`` function can be used from _any_ backend:"
]
},
{
@ -525,14 +588,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"For graphical backends, showing and subsequently closing the window of a figure results in the figure being cleared from the matplotlib system. This is not the case for ``plt.savefig``, which typically should be called before ``plt.show``.\n",
"For graphical backends, showing and subsequently closing the window of a Figure results in the Figure being cleared from the matplotlib system. This is not the case for ``plt.savefig``, which typically should be called before ``plt.show``.\n",
"\n",
"It is also possible to manually close figures without showing them by using the ``plt.close`` function. This could be called to remove the current figure after saving it with ``plt.savefig`` on the occasion where not clearing the figure might interfere with subsequent plots to be created."
"It is also possible to manually close Figures without showing them by using the ``plt.close`` function. This could be called to remove the current Figure after saving it with ``plt.savefig`` on the occasion where not clearing the Figure might interfere with subsequent plots to be created."
]
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python 2",
"language": "python",
@ -548,20 +610,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.11+"
},
"latex_envs": {
"bibliofile": "biblio.bib",
"cite_by": "apalike",
"current_citInitial": 1,
"eqLabelWithNumbers": true,
"eqNumInitial": 0
},
"widgets": {
"state": {},
"version": "0.3.0"
"version": "2.7.12"
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 0
}
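The subplot-grid and headless-rendering ideas from the notebook above can be combined in one standalone sketch (assuming matplotlib is installed): select the "agg" backend before pyplot is imported, lay out the two Axes on a 2x3 grid, and save the Figure with ``savefig`` instead of showing it. The output filename is made up for illustration.

```python
# A minimal sketch, assuming matplotlib is available: grid placement plus
# headless rendering with the "agg" backend, so no display is needed.
import matplotlib
matplotlib.use('agg')  # must be chosen before pyplot is first imported
import matplotlib.pyplot as plt
import os
import tempfile

# Grid of 2 rows x 3 columns: slot 3 is top-right, slot 4 is bottom-left.
top_right_ax = plt.subplot(2, 3, 3, title='#3 = top-right')
bottom_left_ax = plt.subplot(2, 3, 4, title='#4 = bottom-left')

# savefig works from any backend; plt.show() would be a no-op on "agg".
out_path = os.path.join(tempfile.mkdtemp(), 'grid.png')
plt.savefig(out_path)
plt.close()  # clear the Figure so it cannot leak into later plots
```

On a graphical backend the same two ``plt.subplot`` calls would pop up in a window via ``plt.show()``; calling ``plt.savefig`` before ``plt.show`` is the safe ordering either way.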

File diff suppressed because it is too large Load Diff

File diff suppressed because one or more lines are too long


@ -1,57 +1,42 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 1:\n",
"\n",
"1\\. Using the file in ``iris.sample_data_path('atlantic_profiles.nc')`` load the data and print the cube list. Store these cubes in a variable called cubes.\n",
"\n",
"2\\. Print a sorted list of unique names for the cubes.\n",
"\n",
"3\\. Extract the \"sea_water_potential_temperature\" cube. Print the minimum, maximum, mean and standard deviation of the cube's data.\n",
"\n",
"4\\. Print the attributes of the cube.\n",
"\n",
"5\\. Print the names of all coordinates on the cube. (Hint: Remember the cube.coords method)\n",
"\n",
"6\\. Get hold of the \"latitude\" coordinate on the cube. Identify whether the cube has bounds. Print the minimum and maximum latitude points in this cube."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": ""
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 1:\n",
"\n",
"1\\. Using the file in ``iris.sample_data_path('atlantic_profiles.nc')`` load the data and print the cube list. Store these cubes in a variable called cubes.\n",
"\n",
"2\\. Print a sorted list of unique names for the cubes.\n",
"\n",
"3\\. Extract the \"sea_water_potential_temperature\" cube. Print the minimum, maximum, mean and standard deviation of the cube's data.\n",
"\n",
"4\\. Print the attributes of the cube.\n",
"\n",
"5\\. Print the names of all coordinates on the cube. (Hint: Remember the cube.coords method)\n",
"\n",
"6\\. Get hold of the \"latitude\" coordinate on the cube. Identify whether the cube has bounds. Print the minimum and maximum latitude points in this cube."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 2
}
],
"metadata": {}
}
]
}
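The statistics step in part 3 of the exercise reduces to plain numpy calls, since a cube's ``data`` attribute is a (possibly masked) numpy array. A sketch with a stand-in array, assuming no iris installation is to hand; the real answer would apply the same calls to ``cube.data``:

```python
# Stand-in for cube.data of the "sea_water_potential_temperature" cube;
# the values here are invented purely for illustration.
import numpy as np

data = np.linspace(2.0, 18.0, 101)

print('min: {}'.format(data.min()))
print('max: {}'.format(data.max()))
print('mean: {}'.format(data.mean()))
print('std: {}'.format(data.std()))
```

Part 2 has a similarly compact shape with iris itself: ``sorted(set(cube.name() for cube in cubes))`` gives the sorted unique names.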


@ -1,79 +1,80 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 2:**\n",
"Print the result of ``iris.sample_data_path('uk_hires.pp')`` to verify that it returns a string pointing to a file on your system. Use this string directly in the call to ``iris.load`` and confirm the result is the same as in the previous example e.g.:\n",
"\n",
" print iris.load('/path/to/iris/sampledata/uk_hires.pp', 'air_potential_temperature')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 2 continued:** Read in the files found at ``iris.sample_data_path('GloSea4', 'ensemble_010.pp')`` and ``iris.sample_data_path('GloSea4', 'ensemble_011.pp')`` using a single load call. Do this by:\n",
"\n",
"1\\. Providing a list of the two filenames."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import iris\n",
"print iris.load([iris.sample_data_path('GloSea4', 'ensemble_010.pp'),\n",
" iris.sample_data_path('GloSea4', 'ensemble_011.pp')])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Providing a suitable glob pattern."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"print iris.load(iris.sample_data_path('GloSea4', 'ensemble_01[12].pp'))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": ""
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 2:**\n",
"Print the result of ``iris.sample_data_path('uk_hires.pp')`` to verify that it returns a string pointing to a file on your system. Use this string directly in the call to ``iris.load`` and confirm the result is the same as in the previous example e.g.:\n",
"\n",
" print iris.load('/path/to/iris/sampledata/uk_hires.pp', 'air_potential_temperature')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 2 continued:** Read in the files found at ``iris.sample_data_path('GloSea4', 'ensemble_010.pp')`` and ``iris.sample_data_path('GloSea4', 'ensemble_011.pp')`` using a single load call. Do this by:\n",
"\n",
"1\\. Providing a list of the two filenames."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris\n",
"print iris.load([iris.sample_data_path('GloSea4', 'ensemble_010.pp'),\n",
" iris.sample_data_path('GloSea4', 'ensemble_011.pp')])"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"0: surface_temperature / (K) (realization: 2; time: 6; latitude: 145; longitude: 192)\n"
]
}
],
"prompt_number": 2
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Providing a suitable glob pattern."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"print iris.load(iris.sample_data_path('GloSea4', 'ensemble_01[12].pp'))"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"0: surface_temperature / (K) (time: 6; forecast_reference_time: 2; latitude: 145; longitude: 192)\n"
]
}
],
"prompt_number": 3
}
],
"metadata": {}
}
]
}
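The glob pattern in part 2 is ordinary shell-style matching: ``[12]`` matches a single character that is either "1" or "2". The stdlib ``fnmatch`` module (no iris needed) shows which GloSea4-style filenames such a pattern selects; the filename list here is invented for illustration:

```python
# Shell-style pattern matching, as used by iris.load's glob expansion.
import fnmatch

filenames = ['ensemble_001.pp', 'ensemble_010.pp',
             'ensemble_011.pp', 'ensemble_012.pp']
matches = fnmatch.filter(filenames, 'ensemble_01[12].pp')
print(matches)  # ['ensemble_011.pp', 'ensemble_012.pp']
```

Note that ``[12]`` selects ensemble_011 and ensemble_012; to pick up exactly ensemble_010 and ensemble_011, as named in part 1, the pattern would be ``ensemble_01[01].pp``.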


@ -1,39 +1,24 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 3:**\n",
"\n",
"1. Write a function which, when given a cube, returns True or False depending on whether the cube has any cell methods.\n",
"2. Use this function as a value for the ``iris.Constraint`` **cube_func** keyword, and load the file in ``iris.sample_data_path('A1B_north_america.nc')`` such that only cubes with cell methods are loaded (note: in this case, that is all that exists in the file)."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": ""
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 3:**\n",
"\n",
"1. Write a function which, when given a cube, returns True or False depending on whether the cube has any cell methods.\n",
"2. Use this function as a value for the ``iris.Constraint`` **cube_func** keyword, and load the file in ``iris.sample_data_path('A1B_north_america.nc')`` such that only cubes with cell methods are loaded (note: in this case, that is all that exists in the file)."
]
}
],
"metadata": {}
}
]
}
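The shape of function that ``cube_func`` expects is simply cube in, bool out. A sketch of part 1 using a hypothetical stand-in class rather than a real iris cube (iris stores cell methods on the ``cell_methods`` attribute, as a tuple; the string entries below are invented placeholders for real ``CellMethod`` objects):

```python
class FakeCube(object):
    """Hypothetical stand-in exposing only the ``cell_methods`` attribute."""
    def __init__(self, cell_methods):
        self.cell_methods = cell_methods

def has_cell_methods(cube):
    # An empty tuple means the cube carries no cell methods.
    return len(cube.cell_methods) > 0

cubes = [FakeCube(()), FakeCube(('mean: time',))]
print([has_cell_methods(c) for c in cubes])  # [False, True]
```

With iris, the same predicate plugs straight into part 2 as ``iris.load(path, iris.Constraint(cube_func=has_cell_methods))``.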


@ -1,58 +1,43 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 4\n",
"\n",
"The following exercise is designed to give you experience of identifying why two cubes are not merging. Work is underway to make this identification process more automatic, but the resolution of the identified differences will still be a necessary process.\n",
"\n",
"There are 6 problems, in each of which two cubes are not merging into a single cube as desired. In no particular order, the problems are:\n",
"\n",
" 1. one of the cubes has a history attribute, but the other doesn't\n",
" 2. one of the cubes has bounds on the spatial coordinates, but the other doesn't\n",
" 3. the two cubes have different time coordinate units\n",
" 4. the two cubes have different data dtypes\n",
" 5. the two cubes have different long names\n",
" 6. the two cubes have different shapes (the data must currently be loaded to correct this)\n",
" \n",
"The files can be found in the repository along with this course in ```exercises/iris/merge/```. There are two files to be loaded for each exercise: ```merge_exercise.{problem_number}.f1.nc``` and ```merge_exercise.{problem_number}.f2.nc```.\n",
"\n",
"Identify, and correct, the reason that the two cubes are not merging for all 6 sets of files."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Problem 1"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": ""
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 4\n",
"\n",
"The following exercise is designed to give you experience of identifying why two cubes are not merging. Work is underway to make this identification process more automatic, but the resolution of the identified differences will still be a necessary process.\n",
"\n",
"There are 6 problems, in each of which two cubes are not merging into a single cube as desired. In no particular order, the problems are:\n",
"\n",
" 1. one of the cubes has a history attribute, but the other doesn't\n",
" 2. one of the cubes has bounds on the spatial coordinates, but the other doesn't\n",
" 3. the two cubes have different time coordinate units\n",
" 4. the two cubes have different data dtypes\n",
" 5. the two cubes have different long names\n",
" 6. the two cubes have different shapes (the data must currently be loaded to correct this)\n",
" \n",
"The files can be found in the repository along with this course in ```exercises/iris/merge/```. There are two files to be loaded for each exercise: ```merge_exercise.{problem_number}.f1.nc``` and ```merge_exercise.{problem_number}.f2.nc```.\n",
"\n",
"Identify, and correct, the reason that the two cubes are not merging for all 6 sets of files."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Problem 1"
]
}
],
"metadata": {}
}
]
}
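Several of the six problems come down to mismatched metadata or dtypes between the two halves. A numpy-only sketch of the dtype fix (problem 4), with invented data standing in for the two files' cubes: casting one array to the other's dtype is the usual resolution before a merge will succeed, and attribute mismatches such as problem 1 are resolved in the same spirit, e.g. by deleting the extra ``history`` entry from ``cube.attributes``.

```python
import numpy as np

# Stand-ins for the data of merge_exercise.4.f1.nc and .f2.nc.
data_f1 = np.array([1.0, 2.0], dtype=np.float32)
data_f2 = np.array([3.0, 4.0], dtype=np.float64)

# Cast so both halves share a dtype, as a merge requires.
data_f1 = data_f1.astype(data_f2.dtype)
print(data_f1.dtype == data_f2.dtype)  # True
```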


@ -1,61 +1,44 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import iris.analysis.cartography\n",
"cube.coord('grid_latitude').guess_bounds()\n",
"cube.coord('grid_longitude').guess_bounds()\n",
"grid_areas = iris.analysis.cartography.area_weights(cube)\n",
"\n",
"area_avg = cube.collapsed(['grid_longitude', 'grid_latitude'], iris.analysis.MEAN, weights=grid_areas)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 5:** What other aggregators are available? Calculate the potential temperature variance with time for the area averaged cube (hint: We want to reduce the vertical dimension, and end up with a cube of length 3). Print the data values of the resulting cube."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": ""
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris.analysis.cartography\n",
"cube.coord('grid_latitude').guess_bounds()\n",
"cube.coord('grid_longitude').guess_bounds()\n",
"grid_areas = iris.analysis.cartography.area_weights(cube)\n",
"\n",
"area_avg = cube.collapsed(['grid_longitude', 'grid_latitude'], iris.analysis.MEAN, weights=grid_areas)"
],
"language": "python",
"metadata": {},
"outputs": []
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Exercise 5:** What other aggregators are available? Calculate the potential temperature variance with time for the area averaged cube (hint: We want to reduce the vertical dimension, and end up with a cube of length 3). Print the data values of the resulting cube."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [],
"language": "python",
"metadata": {},
"outputs": []
}
],
"metadata": {}
}
]
}
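The two collapses above can be sketched with plain numpy, assuming no iris: area weights are proportional to the cosine of latitude, collapsing with ``iris.analysis.MEAN`` and weights corresponds to ``np.average`` along the relevant axis, and collapsing with a variance aggregator corresponds to ``ndarray.var``. The shapes below are invented for illustration (vertical: 4, latitude: 3); the hint's "cube of length 3" matches the variance result here.

```python
import numpy as np

lats = np.deg2rad(np.array([0.0, 30.0, 60.0]))
weights = np.cos(lats)                   # stand-in area weights
data = np.arange(12.0).reshape(4, 3)     # (model_level: 4, latitude: 3)

# Area-weighted mean over latitude, cf. cube.collapsed(..., MEAN, weights=...)
area_avg = np.average(data, axis=1, weights=weights)

# Variance over the vertical dimension, cf. collapsing with a variance aggregator
level_var = data.var(axis=0)

print(area_avg.shape, level_var.shape)   # (4,) (3,)
```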

File diff suppressed because one or more lines are too long


@ -1,183 +1,254 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Graduation exercise\n",
"\n",
"1\\. Load 'A1B_north_america.nc' from the iris sample data"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import iris\n",
"filename = iris.sample_data_path(\"A1B_north_america.nc\")\n",
"cube = iris.load_cube(filename)\n",
"print cube"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Extract just data from the year 1980 and beyond from the loaded cube"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"tcoord = cube.coord('time')\n",
"def since_1980(cell):\n",
" return tcoord.units.num2date(cell.point).year >= 1980\n",
"\n",
"tcon = iris.Constraint(time=since_1980)\n",
"cube = cube.extract(tcon)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"tcoord = cube.coord('time')\n",
"\n",
"print tcoord.units.num2date(tcoord.points.min())\n",
"print tcoord.units.num2date(tcoord.points.max())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"3\\. Define a function which takes a coordinate and a single time point as arguments, and returns the decade. For example, your function should return 2010 for the following:\n",
"\n",
" time = iris.coords.DimCoord([10], 'time', units='days since 2018-01-01')\n",
" print your_decade_function(time, time.points[0])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"def get_decade(coord, point):\n",
" year = coord.units.num2date(point).year\n",
" return (year/10)*10\n",
"time = iris.coords.DimCoord([10], 'time', units='days since 2018-01-01')\n",
"print get_decade(time, time.points[0])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"4\\. Add a \"decade\" coordinate to the loaded cube using your function and the coord categorisation module"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import iris.coord_categorisation as coord_cat\n",
"coord_cat.add_categorised_coord(cube, 'decade', 'time', get_decade)\n",
"print cube.coord('decade')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"5\\. Calculate the decadal means cube for this scenario"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import iris.analysis\n",
"cube = cube.aggregated_by('decade', iris.analysis.MEAN)\n",
"print cube"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"6\\. Create a figure with 3 rows and 4 columns displaying the decadal means, with the decade displayed prominently in each axes' title"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"import iris.plot as iplt\n",
"\n",
"plt.figure(figsize=(12, 6))\n",
"\n",
"plt.suptitle('Decadal means for the A1B scenario')\n",
"for i, decade_cube in enumerate(cube.slices(['latitude', 'longitude'])):\n",
" plt.subplot(3, 4, i+1)\n",
" iplt.contourf(decade_cube, 20)\n",
" plt.title('{}'.format(decade_cube.coord('decade').points[0]))\n",
" plt.gca().coastlines()\n",
"plt.show()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": "iris_exercise_7"
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Graduation exercise\n",
"\n",
"1\\. Load 'A1B_north_america.nc' from the iris sample data"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris\n",
"filename = iris.sample_data_path(\"A1B_north_america.nc\")\n",
"cube = iris.load_cube(filename)\n",
"print cube"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"air_temperature / (K) (time: 240; latitude: 37; longitude: 49)\n",
" Dimension coordinates:\n",
" time x - -\n",
" latitude - x -\n",
" longitude - - x\n",
" Auxiliary coordinates:\n",
" forecast_period x - -\n",
" Scalar coordinates:\n",
" forecast_reference_time: 1859-09-01 06:00:00\n",
" height: 1.5 m\n",
" Attributes:\n",
" Conventions: CF-1.5\n",
" Model scenario: A1B\n",
" STASH: m01s03i236\n",
" source: Data from Met Office Unified Model 6.05\n",
" Cell methods:\n",
" mean: time (6 hour)\n"
]
}
],
"prompt_number": 2
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"2\\. Extract just data from the year 1980 and beyond from the loaded cube"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"tcoord = cube.coord('time')\n",
"def since_1980(cell):\n",
" return tcoord.units.num2date(cell.point).year >= 1980\n",
"\n",
"tcon = iris.Constraint(time=since_1980)\n",
"cube = cube.extract(tcon)"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 3
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"tcoord = cube.coord('time')\n",
"\n",
"print tcoord.units.num2date(tcoord.points.min())\n",
"print tcoord.units.num2date(tcoord.points.max())"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"1980-06-01 00:00:00\n",
"2099-06-01 00:00:00\n"
]
}
],
"prompt_number": 4
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"3\\. Define a function which takes a coordinate and a single time point as arguments, and returns the decade. For example, your function should return 2010 for the following:\n",
"\n",
" time = iris.coords.DimCoord([10], 'time', units='days since 2018-01-01')\n",
" print your_decade_function(time, time.points[0])"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"def get_decade(coord, point):\n",
" year = coord.units.num2date(point).year\n",
" return (year/10)*10\n",
"time = iris.coords.DimCoord([10], 'time', units='days since 2018-01-01')\n",
"print get_decade(time, time.points[0])"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2010\n"
]
}
],
"prompt_number": 5
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"4\\. Add a \"decade\" coordinate to the loaded cube using your function and the coord categorisation module"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris.coord_categorisation as coord_cat\n",
"coord_cat.add_categorised_coord(cube, 'decade', 'time', get_decade)\n",
"print cube.coord('decade')"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"AuxCoord(array([1980, 1980, 1980, 1980, 1980, 1980, 1980, 1980, 1980, 1980, 1990,\n",
" 1990, 1990, 1990, 1990, 1990, 1990, 1990, 1990, 1990, 2000, 2000,\n",
" 2000, 2000, 2000, 2000, 2000, 2000, 2000, 2000, 2010, 2010, 2010,\n",
" 2010, 2010, 2010, 2010, 2010, 2010, 2010, 2020, 2020, 2020, 2020,\n",
" 2020, 2020, 2020, 2020, 2020, 2020, 2030, 2030, 2030, 2030, 2030,\n",
" 2030, 2030, 2030, 2030, 2030, 2040, 2040, 2040, 2040, 2040, 2040,\n",
" 2040, 2040, 2040, 2040, 2050, 2050, 2050, 2050, 2050, 2050, 2050,\n",
" 2050, 2050, 2050, 2060, 2060, 2060, 2060, 2060, 2060, 2060, 2060,\n",
" 2060, 2060, 2070, 2070, 2070, 2070, 2070, 2070, 2070, 2070, 2070,\n",
" 2070, 2080, 2080, 2080, 2080, 2080, 2080, 2080, 2080, 2080, 2080,\n",
" 2090, 2090, 2090, 2090, 2090, 2090, 2090, 2090, 2090, 2090]), standard_name=None, units=Unit('1'), long_name=u'decade')\n"
]
}
],
"prompt_number": 6
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"5\\. Calculate the decadal means cube for this scenario"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris.analysis\n",
"cube = cube.aggregated_by('decade', iris.analysis.MEAN)\n",
"print cube"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"air_temperature / (K) (time: 12; latitude: 37; longitude: 49)\n",
" Dimension coordinates:\n",
" time x - -\n",
" latitude - x -\n",
" longitude - - x\n",
" Auxiliary coordinates:\n",
" decade x - -\n",
" forecast_period x - -\n",
" Scalar coordinates:\n",
" forecast_reference_time: 1859-09-01 06:00:00\n",
" height: 1.5 m\n",
" Attributes:\n",
" Conventions: CF-1.5\n",
" Model scenario: A1B\n",
" STASH: m01s03i236\n",
" history: Mean of air_temperature aggregated over decade\n",
" source: Data from Met Office Unified Model 6.05\n",
" Cell methods:\n",
" mean: time (6 hour)\n",
" mean: decade\n"
]
}
],
"prompt_number": 7
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"6\\. Create a figure with 3 rows and 4 columns displaying the decadal means, with the decade displayed prominently in each axes' title"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import matplotlib.pyplot as plt\n",
"import iris.plot as iplt\n",
"\n",
"plt.figure(figsize=(12, 6))\n",
"\n",
"plt.suptitle('Decadal means for the A1B scenario')\n",
"for i, decade_cube in enumerate(cube.slices(['latitude', 'longitude'])):\n",
" plt.subplot(3, 4, i+1)\n",
" iplt.contourf(decade_cube, 20)\n",
" plt.title('{}'.format(decade_cube.coord('decade').points[0]))\n",
" plt.gca().coastlines()\n",
"plt.show()"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 16
}
],
"metadata": {}
}
]
}
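One caveat worth flagging about the ``get_decade`` answer above: ``(year/10)*10`` relies on Python 2 integer division. Under Python 3, ``/`` is true division, so the expression would return e.g. ``201.8 * 10 == 2018.0`` rather than ``2010``; floor division works on both. A minimal arithmetic-only sketch (the real function would first recover the year with ``coord.units.num2date(point).year``):

```python
def get_decade_year(year):
    # Floor division keeps the result an int on Python 2 and 3 alike.
    return (year // 10) * 10

print(get_decade_year(2018))  # 2010
print(get_decade_year(1980))  # 1980
```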

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@ -1,99 +1,111 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 1\n",
"\n",
"Use ``np.arange`` and ``reshape`` to create the array\n",
"\n",
" A = [[1 2 3 4]\n",
" [5 6 7 8]]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"import numpy as np\n",
"A = np.arange(1, 9).reshape(2, -1)\n",
"print A"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"Use ``np.array`` to create the array\n",
"\n",
" B = [1 2]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"B = np.array([1, 2])\n",
"print B"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Use broadcasting to add ``B`` to ``A`` to create the final array\n",
"\n",
" A + B = [[2 3 4 5]\n",
" [7 8 9 10]]\n",
"\n",
"Hint: what shape does ``B`` have to be changed to?"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"print A + B.reshape(2, 1)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": "",
"signature": "sha256:848ef38126532012d51f85fda6138dfe097a4c6c37850c846063ebee9d3928bb"
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Exercise 1\n",
"\n",
"Use ``np.arange`` and ``reshape`` to create the array\n",
"\n",
" A = [[1 2 3 4]\n",
" [5 6 7 8]]"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import numpy as np\n",
"A = np.arange(1, 9).reshape(2, -1)\n",
"print A"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"[[1 2 3 4]\n",
" [5 6 7 8]]\n"
]
}
],
"prompt_number": 1
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"Use ``np.array`` to create the array\n",
"\n",
" B = [1 2]"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"B = np.array([1, 2])\n",
"print B"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"[1 2]\n"
]
}
],
"prompt_number": 2
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Use broadcasting to add ``B`` to ``A`` to create the final array\n",
"\n",
" A + B = [[2 3 4 5]\n",
" [7 8 9 10]]\n",
"\n",
"Hint: what shape does ``B`` have to be changed to?"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"print A + B.reshape(2, 1)"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"[[ 2 3 4 5]\n",
" [ 7 8 9 10]]\n"
]
}
],
"prompt_number": 3
}
],
"metadata": {}
}
]
}
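The broadcasting answer above reshapes ``B`` into a column with ``reshape(2, 1)``; an equivalent and common idiom is ``B[:, np.newaxis]``. A standalone sketch reproducing the exercise end to end:

```python
import numpy as np

A = np.arange(1, 9).reshape(2, -1)   # [[1 2 3 4], [5 6 7 8]]
B = np.array([1, 2])

# (2, 4) + (2, 1): the column of B is broadcast across each row of A.
result = A + B[:, np.newaxis]
print(result)
```

Either spelling satisfies the broadcasting rule that trailing dimensions must match or be 1: shape ``(2, 1)`` against ``(2, 4)`` stretches the size-1 axis.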


@ -1,268 +1,305 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# Exercise: trapezoidal integration\n",
"\n",
"In this exercise, you are tasked with implementing the simple trapezoid rule\n",
"formula for numerical integration. If we want to compute the definite integral\n",
"\n",
"$$\n",
" \\int_{a}^{b}f(x)dx\n",
"$$\n",
"\n",
"we can partition the integration interval $[a,b]$ into smaller subintervals. We then approximate the area under the curve for each subinterval by calculating the area of the trapezoid created by linearly interpolating between the two function values at each end of the subinterval:\n",
"\n",
"![Illustration of the trapezoidal rule](../images/trapezoidal_rule.png)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"For a pre-computed $y$ array (where $y = f(x)$ at discrete samples) the trapezoidal rule equation is:\n",
"\n",
"$$\n",
" \\int_{a}^{b}f(x)dx\\approx\\frac{1}{2}\\sum_{i=1}^{n}\\left(x_{i}-x_{i-1}\\right)\\left(y_{i}+y_{i-1}\\right).\n",
"$$\n",
"\n",
"In pure python, this can be written as:\n",
"\n",
" def trapz_slow(x, y):\n",
" area = 0.\n",
" for i in range(1, len(x)):\n",
" area += (x[i] - x[i-1]) * (y[i] + y[i-1])\n",
" return area / 2"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Exercise 2\n",
"\n",
"#### Part 1\n",
"\n",
"Create two arrays $x$ and $y$, where $x$ is a linearly spaced array in the interval $[0, 3]$ of length 10, and $y$ represents the function $f(x) = x^2$ sampled at $x$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"x = np.linspace(0, 3, 10)\n",
"y = x ** 2\n",
"\n",
"print x\n",
"print y"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 2\n",
"\n",
"Use indexing (not a for loop) to find the 9 values representing $y_{i}+y_{i-1}$ for $i$ between 1 and 10.\n",
"\n",
"Hint: What indexing would be needed to get all but the last element of the 1d array **``y``**. Similarly what indexing would be needed to get all but the first element of a 1d array."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"y_roll_sum = y[:-1] + y[1:]\n",
"print y_roll_sum"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 3\n",
"\n",
"Write a function `trapz(x, y)`, that applies the trapezoid formula to pre-computed values, where `x` and `y` are 1-d arrays. The function should not use a for loop."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"def trapz(x, y):\n",
" return 0.5 * np.sum((x[1:] - x[:-1]) * (y[:-1] + y[1:]))"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 4\n",
"\n",
"Verify that your function is correct by using the arrays created in #1 as input to ``trapz``. Your answer should be a close approximation of $\\int_0^3 x^2$ which is $9$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"trapz(x, y)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 5 (extension)\n",
"\n",
"``numpy`` and ``scipy.integrate`` provides many common integration schemes. Find the documentation for NumPy's own version of the trapezoidal integration scheme and check its result with your own:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"print np.trapz(y, x)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 6 (extension)\n",
"\n",
"Write a function `trapzf(f, a, b, npts=100)` that accepts a function `f`, the endpoints `a` and `b` and the number of samples to take `npts`. Sample the function uniformly at these\n",
"points and return the value of the integral.\n",
"\n",
"Use the trapzf function to identify the minimum number of sampling points needed to approximate the integral $\\int_0^3 x^2$ with an absolute error of $<=0.0001$. (A loop is necessary here)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"def trapzf(f, a, b, npts=100):\n",
" x = np.linspace(a, b, npts)\n",
" y = f(x)\n",
" return trapz(x, y)\n",
"\n",
"def x_squared(x):\n",
" return x ** 2\n",
"\n",
"abs_err = 1.0\n",
"n_samples = 0\n",
"expected = 9\n",
"while abs_err > 0.0001:\n",
" n_samples += 1\n",
" integral = trapzf(x_squared, 0, 3, npts=n_samples)\n",
" abs_err = np.abs(integral - 9)\n",
"\n",
"print 'Minimum samples for absolute error less than or equal to 0.0001:', n_samples\n",
" "
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 Sys",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.1+"
},
"widgets": {
"state": {},
"version": "0.3.0"
}
"name": "",
"signature": "sha256:e3e7c1608702d0a84f35e6fa63112daab968fe74a96ae1f6ee9f39993826a343"
},
"nbformat": 4,
"nbformat_minor": 1
}
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# Exercise: trapezoidal integration\n",
"\n",
"In this exercise, you are tasked with implementing the simple trapezoid rule\n",
"formula for numerical integration. If we want to compute the definite integral\n",
"\n",
"$$\n",
" \\int_{a}^{b}f(x)dx\n",
"$$\n",
"\n",
"we can partition the integration interval $[a,b]$ into smaller subintervals. We then approximate the area under the curve for each subinterval by calculating the area of the trapezoid created by linearly interpolating between the two function values at each end of the subinterval:\n",
"\n",
"![Illustration of the trapezoidal rule](../images/trapezoidal_rule.png)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"For a pre-computed $y$ array (where $y = f(x)$ at discrete samples) the trapezoidal rule equation is:\n",
"\n",
"$$\n",
" \\int_{a}^{b}f(x)dx\\approx\\frac{1}{2}\\sum_{i=1}^{n}\\left(x_{i}-x_{i-1}\\right)\\left(y_{i}+y_{i-1}\\right).\n",
"$$\n",
"\n",
"In pure python, this can be written as:\n",
"\n",
" def trapz_slow(x, y):\n",
" area = 0.\n",
" for i in range(1, len(x)):\n",
" area += (x[i] - x[i-1]) * (y[i] + y[i-1])\n",
" return area / 2"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "subslide"
}
},
"source": [
"### Exercise 2\n",
"\n",
"#### Part 1\n",
"\n",
"Create two arrays $x$ and $y$, where $x$ is a linearly spaced array in the interval $[0, 3]$ of length 10, and $y$ represents the function $f(x) = x^2$ sampled at $x$."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import numpy as np\n",
"\n",
"x = np.linspace(0, 3, 10)\n",
"y = x ** 2\n",
"\n",
"print x\n",
"print y"
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"[ 0. 0.33333333 0.66666667 1. 1.33333333 1.66666667\n",
" 2. 2.33333333 2.66666667 3. ]\n",
"[ 0. 0.11111111 0.44444444 1. 1.77777778 2.77777778\n",
" 4. 5.44444444 7.11111111 9. ]\n"
]
}
],
"prompt_number": 1
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 2\n",
"\n",
"Use indexing (not a for loop) to find the 9 values representing $y_{i}+y_{i-1}$ for $i$ between 1 and 10.\n",
"\n",
"Hint: What indexing would be needed to get all but the last element of the 1d array **``y``**. Similarly what indexing would be needed to get all but the first element of a 1d array."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"y_roll_sum = y[:-1] + y[1:]\n",
"print y_roll_sum"
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"[ 0.11111111 0.55555556 1.44444444 2.77777778 4.55555556\n",
" 6.77777778 9.44444444 12.55555556 16.11111111]\n"
]
}
],
"prompt_number": 2
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 3\n",
"\n",
"Write a function `trapz(x, y)`, that applies the trapezoid formula to pre-computed values, where `x` and `y` are 1-d arrays. The function should not use a for loop."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"def trapz(x, y):\n",
" return 0.5 * np.sum((x[1:] - x[:-1]) * (y[:-1] + y[1:]))"
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"prompt_number": 3
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 4\n",
"\n",
"Verify that your function is correct by using the arrays created in #1 as input to ``trapz``. Your answer should be a close approximation of $\\int_0^3 x^2$ which is $9$."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"trapz(x, y)"
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"metadata": {},
"output_type": "pyout",
"prompt_number": 4,
"text": [
"9.0555555555555554"
]
}
],
"prompt_number": 4
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 5 (extension)\n",
"\n",
"``numpy`` and ``scipy.integrate`` provides many common integration schemes. Find the documentation for NumPy's own version of the trapezoidal integration scheme and check its result with your own:"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"print np.trapz(y, x)"
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"9.05555555556\n"
]
}
],
"prompt_number": 5
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "-"
}
},
"source": [
"#### Part 6 (extension)\n",
"\n",
"Write a function `trapzf(f, a, b, npts=100)` that accepts a function `f`, the endpoints `a` and `b` and the number of samples to take `npts`. Sample the function uniformly at these\n",
"points and return the value of the integral.\n",
"\n",
"Use the trapzf function to identify the minimum number of sampling points needed to approximate the integral $\\int_0^3 x^2$ with an absolute error of $<=0.0001$. (A loop is necessary here)"
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"def trapzf(f, a, b, npts=100):\n",
" x = np.linspace(a, b, npts)\n",
" y = f(x)\n",
" return trapz(x, y)\n",
"\n",
"def x_squared(x):\n",
" return x ** 2\n",
"\n",
"abs_err = 1.0\n",
"n_samples = 0\n",
"expected = 9\n",
"while abs_err > 0.0001:\n",
" n_samples += 1\n",
" integral = trapzf(x_squared, 0, 3, npts=n_samples)\n",
" abs_err = np.abs(integral - 9)\n",
"\n",
"print 'Minimum samples for absolute error less than or equal to 0.0001:', n_samples\n",
" "
],
"language": "python",
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"Minimum samples for absolute error less than or equal to 0.0001: 214\n"
]
}
],
"prompt_number": 6
}
],
"metadata": {}
}
]
}
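The vectorised `trapz` from the exercise solution can be checked end to end in a short standalone script. Names mirror the notebook; the pure-Python `trapz_slow` from the exercise text serves as the reference:

```python
import numpy as np

def trapz_slow(x, y):
    # Pure-Python reference implementation from the exercise text.
    area = 0.
    for i in range(1, len(x)):
        area += (x[i] - x[i-1]) * (y[i] + y[i-1])
    return area / 2

def trapz(x, y):
    # Vectorised equivalent: interval widths times summed endpoint heights.
    return 0.5 * np.sum((x[1:] - x[:-1]) * (y[:-1] + y[1:]))

x = np.linspace(0, 3, 10)
y = x ** 2

# Exact integral of x^2 over [0, 3] is 9; ten samples give roughly 9.056.
assert np.isclose(trapz(x, y), trapz_slow(x, y))
assert abs(trapz(x, y) - 9.0) < 0.1
```

The recorded output in the old notebook (9.0555555555555554) agrees with this value.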

View File

@ -1,61 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Python courses for the scientific researcher\n",
"---\n",
"This repository contains several courses for the benefit of scientific researchers,\n",
"particularly in the fields of oceanography and meteorology.\n",
"\n",
"There are currently four courses:\n",
"\n",
"* [An introduction to numpy](course_content/notebooks/numpy_intro.ipynb)\n",
" 3.5 hours &mdash; depends on a basic Python background\n",
"\n",
"* [An introduction to matplotlib](course_content/notebooks/matplotlib_intro.ipynb)\n",
" 3 hours &mdash; depends on \"An introduction to numpy\"\n",
"\n",
"* [Cartopy in a nutshell (for Iris)](course_content/notebooks/cartopy_intro.ipynb)\n",
" 0.5 hours &mdash; depends on \"An introduction to matplotlib\"\n",
"\n",
"* [An introduction to Iris](course_content/notebooks/iris_intro.ipynb)\n",
" 6 hours &mdash; depends on \"Cartopy in a nutshell\"\n",
"\n",
"\n",
"\n",
"\n"
]
}
],
"metadata": {
"hide_input": false,
"kernelspec": {
"display_name": "Python",
"language": "python",
"name": "python2"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2.0
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.11+"
},
"latex_envs": {
"bibliofile": "biblio.bib",
"cite_by": "apalike",
"current_citInitial": 1.0,
"eqLabelWithNumbers": true,
"eqNumInitial": 0.0
}
},
"nbformat": 4,
"nbformat_minor": 0
}

View File

@ -16,23 +16,14 @@ cd build
cp -rf ../course_content/images images
cp -rf ../course_content/resources resources
cd html
for name in "numpy_intro" "matplotlib_intro" "cartopy_intro" "iris_intro"
do
#ipython nbconvert --to slides ../../course_content/${name}.ipynb
# Build static (html) copies of the course content.
jupyter nbconvert --to html ../../course_content/notebooks/${name}.ipynb
# Make IPython notebooks of the course content with cell output cleared.
python ../../utils/nbutil.py ../../course_content/notebooks/${name}.ipynb ../notebooks/${name}.ipynb --clear-output
jupyter nbconvert --to html ../course_content/notebooks/${name}.ipynb
mv ../course_content/notebooks/*.html ./html/
done
cd ../solutions
for name in "numpy_exercise_1.ipynb" "numpy_exercise_2.ipynb" "matplotlib_exercise_2.ipynb" "matplotlib_exercise_3.ipynb" "cartopy_exercise_1.ipynb" "iris_exercise_1.ipynb" "iris_exercise_2.ipynb" "iris_exercise_3.ipynb" "iris_exercise_4.ipynb" "iris_exercise_5.ipynb" "iris_exercise_6.ipynb" "iris_exercise_7.ipynb"
do
python ../../utils/nbutil.py ../../course_content/solutions/${name} ${name}
done
#.reveal aside.notes {
# visibility: inline;
#}
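For reference, the per-notebook work the `make.sh` loop performs can be sketched in Python. The course names and the `../../course_content` paths come from the script itself; `build_commands` is a purely illustrative helper that constructs the command strings without running them:

```python
# Illustrative only: reconstructs the nbconvert/nbutil command lines that
# make.sh issues for each course notebook.
COURSES = ["numpy_intro", "matplotlib_intro", "cartopy_intro", "iris_intro"]

def build_commands(name):
    src = "../../course_content/notebooks/{}.ipynb".format(name)
    return [
        # Build a static HTML copy of the course content.
        "jupyter nbconvert --to html {}".format(src),
        # Emit a copy of the notebook with all cell output cleared.
        "python ../../utils/nbutil.py {} ../notebooks/{}.ipynb --clear-output".format(src, name),
    ]

for name in COURSES:
    for cmd in build_commands(name):
        print(cmd)
```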

View File

@ -1,420 +0,0 @@
{
"metadata": {
"name": "merge_data_construct"
},
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris\n",
"import iris.tests.stock as stock"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stderr",
"text": [
"/data/local/sci/r28/lib/python2.7/site-packages/nose-1.1.2-py2.7.egg/nose/util.py:14: DeprecationWarning: The compiler package is deprecated and removed in Python 3.x.\n",
" from compiler.consts import CO_GENERATOR\n",
"/data/local/sci/r28/lib/python2.7/site-packages/nose-1.1.2-py2.7.egg/nose/plugins/manager.py:405: UserWarning: Module IPython was already imported from /data/local/itpe/git/ipython/build/lib/IPython/__init__.pyc, but /net/home/h02/itpe/.local/lib/python2.7/site-packages is being added to sys.path\n",
" import pkg_resources\n"
]
}
],
"prompt_number": 1
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cube = stock.realistic_4d()"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 2
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"sub_cubes = (cube[0, 0, ...], cube[1, 0, ...])"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 3
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"sub_cubes"
],
"language": "python",
"metadata": {},
"outputs": [
{
"metadata": {},
"output_type": "pyout",
"prompt_number": 4,
"text": [
"(<iris 'Cube' of air_potential_temperature / (K) (grid_latitude: 100; grid_longitude: 100)>,\n",
" <iris 'Cube' of air_potential_temperature / (K) (grid_latitude: 100; grid_longitude: 100)>)"
]
}
],
"prompt_number": 4
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"potential = iris.cube.CubeList(sub_cubes)\n",
"print potential.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 5
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"!rm merge_exercise.*.nc"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"rm: cannot remove `merge_exercise.*.nc': No such file or directory\r\n"
]
}
],
"prompt_number": 6
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"potential[0].attributes['History'] = 'unknown'\n",
"iris.save(potential[0], 'merge_exercise.1.f1.nc')\n",
"iris.save(potential[1], 'merge_exercise.1.f2.nc')\n",
"potential[0].attributes.pop('History')"
],
"language": "python",
"metadata": {},
"outputs": [
{
"metadata": {},
"output_type": "pyout",
"prompt_number": 7,
"text": [
"'unknown'"
]
}
],
"prompt_number": 7
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.1.*.nc')\n",
"print len(cubes)\n",
"#print cubes.describe_merge()\n",
"cubes[0].attributes.pop('History')\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 9
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"task = 6\n",
"potential[1].coord('grid_latitude').bounds = None\n",
"potential[1].coord('grid_longitude').bounds = None\n",
"iris.save(potential[0], 'merge_exercise.6.f1.nc')\n",
"iris.save(potential[1], 'merge_exercise.6.f2.nc')\n",
"potential[1].coord('grid_latitude').bounds = potential[0].coord('grid_latitude').bounds\n",
"potential[1].coord('grid_longitude').bounds = potential[0].coord('grid_longitude').bounds"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 10
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.6.*.nc')\n",
"cubes[1].coord('grid_latitude').bounds = cubes[0].coord('grid_latitude').bounds\n",
"cubes[1].coord('grid_longitude').bounds = cubes[0].coord('grid_longitude').bounds\n",
"print len(cubes)\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 11
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"c = potential[1].coord('time')\n",
"print c.units\n",
"c.convert_units('days since 1970-01-01 00:00:00')\n",
"iris.save(potential[0], 'merge_exercise.3.f1.nc')\n",
"iris.save(potential[1], 'merge_exercise.3.f2.nc')\n",
"c.convert_units('hours since 1970-01-01 00:00:00')"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"hours since 1970-01-01 00:00:00\n"
]
}
],
"prompt_number": 12
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"import iris\n",
"reload(iris)"
],
"language": "python",
"metadata": {},
"outputs": [
{
"metadata": {},
"output_type": "pyout",
"prompt_number": 13,
"text": [
"<module 'iris' from '/data/local/itpe/git/iris/lib/iris/__init__.py'>"
]
}
],
"prompt_number": 13
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.3.*.nc')\n",
"cubes[1].coord('time').convert_units(cubes[0].coord('time').units)\n",
"print len(cubes)\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 14
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"potential[1].data = potential[1].data.astype('float64')\n",
"iris.save(potential[0], 'merge_exercise.4.f1.nc')\n",
"iris.save(potential[1], 'merge_exercise.4.f2.nc')\n",
"potential[1].data = potential[1].data.astype('float32')"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 15
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.4.*.nc')\n",
"for cube in cubes:\n",
" cube.data = cube.data.astype('float64')\n",
"print len(cubes)\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 16
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"!rm -rf merge_exercise.5.??.nc\n",
"potential[0].long_name = 'The first timestep'\n",
"potential[1].long_name = 'The second timestep'\n",
"iris.save(potential[0], 'merge_exercise.2.f1.nc')\n",
"iris.save(potential[1], 'merge_exercise.2.f2.nc')\n",
"potential[0].long_name = potential[1].long_name = None"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 17
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.2.*.nc')\n",
"for cube in cubes:\n",
" cube.long_name = None\n",
"print len(cubes)\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 18
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"iris.save(potential[0], 'merge_exercise.5.f1.nc')\n",
"iris.save(stock.realistic_4d()[1:2, 0, ...], 'merge_exercise.5.f2.nc')"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 19
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"cubes = iris.load('merge_exercise.5.??.nc')\n",
"print len(cubes)\n",
"cubes[1] = cubes[1][0, ...]\n",
"for cube in cubes:\n",
" print cube.coord('time')\n",
" cube.data\n",
"for coord1, coord2 in zip(cubes[0].coords(), cubes[1].coords()):\n",
" print coord1 == coord2, coord1.name()\n",
" if not coord1 == coord2:\n",
" print `coord1`\n",
" print `coord2`\n",
" print '---'\n",
"#cubes.describe_merge()\n",
"print cubes.merge()"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"2\n",
"DimCoord([2009-09-09 17:10:00], standard_name=u'time', calendar=u'gregorian', var_name='time')\n",
"DimCoord([2009-09-09 17:20:00], standard_name=u'time', calendar=u'gregorian', var_name='time')\n",
"True grid_latitude\n",
"True grid_longitude\n",
"True atmosphere_hybrid_height_coordinate\n",
"True forecast_period\n",
"True model_level_number\n",
"True sigma\n",
"False time\n",
"DimCoord(array([ 347921.16666667]), standard_name=u'time', units=Unit('hours since 1970-01-01 00:00:00', calendar='gregorian'), var_name='time')\n",
"DimCoord(array([ 347921.33333333]), standard_name=u'time', units=Unit('hours since 1970-01-01 00:00:00', calendar='gregorian'), var_name='time')\n",
"---\n",
"True surface_altitude\n",
"True altitude\n",
"0: air_potential_temperature / (K) (time: 2; grid_latitude: 100; grid_longitude: 100)\n"
]
}
],
"prompt_number": 21
},
{
"cell_type": "code",
"collapsed": false,
"input": [],
"language": "python",
"metadata": {},
"outputs": []
}
],
"metadata": {}
}
]
}
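Iris aside, the pattern exercised in the notebook above is general: slices merge along a new dimension only when everything except the merge coordinate is identical. A minimal stand-in (no Iris required; `Slice` and `merge` are hypothetical names, not Iris API) makes the rule explicit:

```python
import numpy as np

class Slice:
    """Hypothetical stand-in for one time slice of an Iris cube."""
    def __init__(self, time, data, name="air_potential_temperature",
                 units="K", attributes=None):
        self.time = time
        self.data = np.asarray(data)
        self.name = name
        self.units = units
        self.attributes = attributes or {}

def merge(slices):
    # Slices combine only when name, units, dtype and attributes all agree,
    # which is why the notebook pops 'History', converts the time units and
    # aligns dtypes before Iris agrees to merge.
    first = slices[0]
    for s in slices[1:]:
        same = (s.name == first.name and s.units == first.units
                and s.data.dtype == first.data.dtype
                and s.attributes == first.attributes)
        if not same:
            raise ValueError("cannot merge: metadata differs")
    return np.stack([s.data for s in sorted(slices, key=lambda s: s.time)])

a = Slice(0, np.zeros((2, 2), dtype="float32"))
b = Slice(1, np.ones((2, 2), dtype="float32"))
merged = merge([a, b])          # time becomes the new leading dimension
assert merged.shape == (2, 2, 2)

try:
    merge([a, Slice(1, np.ones((2, 2), dtype="float64"))])
except ValueError:
    pass  # float32 vs float64 blocks the merge, as in the dtype exercise
```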

View File

@ -1,51 +0,0 @@
import json


class Notebook(object):
    INPUT = u'input'
    OUTPUTS = u'outputs'
    CELL_TYPE = 'cell_type'

    def __init__(self, fh):
        self.data = json.load(fh)

    def __iter__(self):
        for worksheet in self.data['worksheets']:
            for cell in worksheet['cells']:
                yield cell

    def clear_code(self, keyword='# Solution:'):
        for cell in self:
            if cell[self.CELL_TYPE] == 'code' and cell[self.INPUT] and cell[self.INPUT][0].startswith(keyword):
                cell[self.INPUT] = []
                cell[self.OUTPUTS] = []

    def clear_output(self):
        for cell in self:
            if cell[self.CELL_TYPE] == 'code':
                cell.pop('prompt_number', None)
                cell[self.OUTPUTS] = []

    def save(self, fh):
        json.dump(self.data, fh, indent=4)


if __name__ == '__main__':
    import sys

    notebook = sys.argv[1]
    if len(sys.argv) > 2:
        output = sys.argv[2]
    else:
        output = notebook
    with open(notebook) as fh:
        nb = Notebook(fh)
    if '--clear-output' in sys.argv:
        nb.clear_output()
    if '--clear-code' in sys.argv:
        nb.clear_code()
    with open(output, 'w') as fh:
        nb.save(fh)
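The `--clear-output` path of `nbutil.py` works on nbformat-3 JSON and can be exercised entirely in memory with a minimal worksheet; `clear_output` below is a self-contained re-statement of the method's logic, with file handles simulated via `io.StringIO`:

```python
import io
import json

# Minimal nbformat-3 notebook with one executed code cell.
nb_json = {
    "worksheets": [{"cells": [{
        "cell_type": "code",
        "input": ["print 'hi'"],
        "outputs": [{"output_type": "stream", "text": ["hi\n"]}],
        "prompt_number": 1,
    }]}],
    "metadata": {},
    "nbformat": 3,
    "nbformat_minor": 0,
}

def clear_output(data):
    # Same effect as Notebook.clear_output in nbutil.py: drop prompt
    # numbers and recorded outputs from every code cell.
    for worksheet in data["worksheets"]:
        for cell in worksheet["cells"]:
            if cell["cell_type"] == "code":
                cell.pop("prompt_number", None)
                cell["outputs"] = []
    return data

fh = io.StringIO(json.dumps(nb_json))
data = clear_output(json.load(fh))
assert data["worksheets"][0]["cells"][0]["outputs"] == []
assert "prompt_number" not in data["worksheets"][0]["cells"][0]
```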