Commit c4c03c8

Add centralized GPU admonition for JAX lectures (#447)

* Add centralized GPU admonition for JAX lectures
  - Create `_admonition/gpu.md` for a single-source GPU notice
  - Update `jax_intro.md` to use an include directive
  - Update `numpy_vs_numba_vs_jax.md` to use an include directive

  This makes it easier to maintain consistent GPU notices across all JAX-related lectures.
* Fix grammar: target -> targets
* Simplify GPU admonition text
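The maintainability claim in the commit message follows from how MyST includes work: each lecture pulls the notice from a single path, so an edit to `_admonition/gpu.md` propagates to every including lecture on the next build. A minimal sketch of reusing the notice in a hypothetical new lecture (the path is relative to the including file under `lectures/`):

````
```{include} _admonition/gpu.md
```
````

Any future JAX lecture only needs this one directive instead of a pasted copy of the admonition text.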
1 parent 594aeac commit c4c03c8

3 files changed, +7 −20 lines changed
lectures/_admonition/gpu.md (+5 lines, new file)

````diff
+```{admonition} GPU
+:class: warning
+
+This lecture is designed to run on a GPU. To use Google Colab's free GPUs, click the play icon top right, select Colab, and set the runtime to include a GPU. For local GPU setup, see the [JAX installation guide](https://github.com/google/jax).
+```
````
lectures/jax_intro.md (+1 −10 lines)

````diff
@@ -33,16 +33,7 @@ In addition to what's in Anaconda, this lecture will need the following librarie
 !pip install jax quantecon
 ```
 
-```{admonition} GPU
-:class: warning
-
-This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and targets JAX for GPU programming.
-
-Free GPUs are available on Google Colab.
-To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
-
-Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support.
-If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```{include} _admonition/gpu.md
 ```
 
 ## JAX as a NumPy Replacement
````
lectures/numpy_vs_numba_vs_jax.md (+1 −10 lines)

````diff
@@ -48,16 +48,7 @@ tags: [hide-output]
 !pip install quantecon jax
 ```
 
-```{admonition} GPU
-:class: warning
-
-This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and target JAX for GPU programming.
-
-Free GPUs are available on Google Colab.
-To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
-
-Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support.
-If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```{include} _admonition/gpu.md
 ```
 
 We will use the following imports.
````

0 commit comments
