@@ -4,83 +4,44 @@
 
 -----------------------------------------
 
-[Documentation](https://diffsharp.github.io/)
-
 [](https://github.com/DiffSharp/DiffSharp/actions)
 [](https://coveralls.io/github/DiffSharp/DiffSharp?branch=)
 
 This is the development branch of DiffSharp 1.0.
 
 > **NOTE: This branch is undergoing development. It has incomplete code, functionality, and design that are likely to change without notice.**
 
-## Getting Started
-
-DiffSharp is normally used from an F# Jupyter notebook. You can simply open examples directly in the browser, e.g.
-
-* [index.ipynb](https://mybinder.org/v2/gh/diffsharp/diffsharp.github.io/master?filepath=index.ipynb)
-
-* [getting-started-install.ipynb](https://mybinder.org/v2/gh/diffsharp/diffsharp.github.io/master?filepath=getting-started-install.ipynb)
-
-To use locally in [Visual Studio Code](https://code.visualstudio.com/):
-
-- Install [.NET Interactive Notebooks for VS Code](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.dotnet-interactive-vscode)
-
-- After opening an `.ipynb` execute `Ctrl-Shift-P` for the command palette and chose `Reopen Editor With...` then `.NET Interactive Notebooks`
-
-- To restart the kernel use `restart` from the command palette.
-
-To use locally in Jupyter, first install Jupyter and then:
-
-    dotnet tool install -g microsoft.dotnet-interactive
-    dotnet interactive jupyter install
-
-When using .NET Interactive it is best to completely turn off automatic HTML displays of outputs:
-
-    Formatter.SetPreferredMimeTypesFor(typeof<obj>, "text/plain")
-    Formatter.Register(fun x writer -> fprintfn writer "%120A" x )
-
-You can also use DiffSharp from a script or an application. Here are some example scripts with appropriate package references:
-
-* [docs/index.fsx](http://diffsharp.github.io/index.fsx)
-
-* [docs/getting-started-install.fsx](http://diffsharp.github.io/getting-started-install.fsx)
-
-## Available packages and backends
-
-Now reference an appropriate nuget package from https://nuget.org:
-
-* [`DiffSharp-lite`](https://www.nuget.org/packages/DiffSharp-lite) - This is the reference backend.
-
-* [`DiffSharp-cpu`](https://www.nuget.org/packages/DiffSharp-cpu) - This includes the Torch backend using CPU only.
-
-* [`DiffSharp-cuda-linux`](https://www.nuget.org/packages/DiffSharp-cuda-linux) - This includes the Torch CPU/CUDA 11.1 backend for Linux. Large download. Requires .NET 6 SDK, version `6.0.100-preview.5.21302.13` or greater.
-
-* [`DiffSharp-cuda-windows`](https://www.nuget.org/packages/DiffSharp-cuda-windows) - This includes the Torch CPU/CUDA 11.1 backend for Windows. Large download.
+DiffSharp is a tensor library with support for [differentiable programming](https://en.wikipedia.org/wiki/Differentiable_programming). It is designed for use in machine learning, probabilistic programming, optimization and other domains.
 
-For all but `DiffSharp-lite` add the following to your code:
+**Key features**
 
-    dsharp.config(backend=Backend.Torch)
+* Nested and mixed-mode differentiation
+* Common optimizers, model elements, differentiable probability distributions
+* F# for robust functional programming
+* PyTorch familiar naming and idioms, efficient LibTorch CUDA/C++ tensors with GPU support
+* Linux, macOS, Windows supported
+* Use interactive notebooks in Jupyter and Visual Studio Code
+* 100% open source
 
-## Using a pre-installed or self-built LibTorch 1.8.0
+## Documentation
 
-The Torch CPU and CUDA packages above are large. If you already have `libtorch` 1.8.0 available on your machine you can
+You can find the documentation [here](https://diffsharp.github.io/), including information on installation and getting started.
 
-1. reference `DiffSharp-lite`
+## Communication
 
-2. set `LD_LIBRARY_PATH` to include a directory containing the relevant `torch.so`, `torch_cpu.so` and `torch_cuda.so`, or
-   execute [NativeLibrary.Load](https://docs.microsoft.com/en-us/dotnet/api/system.runtime.interopservices.nativelibrary.load?view=net-5.0) on
-   `torch.so`.
+Please use [GitHub issues](https://github.com/DiffSharp/DiffSharp/issues) to share bug reports, feature requests, installation issues, suggestions etc.
 
-3. use `dsharp.config(backend=Backend.Torch)`
+## Contributing
 
-## Developing DiffSharp Libraries
+We welcome all contributions.
 
-To develop libraries built on DiffSharp, do the following:
+* Bug fixes: if you encounter a bug, please open an [issue](https://github.com/DiffSharp/DiffSharp/issues) describing the bug. If you are planning to contribute a bug fix, please feel free to do so in a pull request.
+* New features: if you plan to contribute new features, please first open an [issue](https://github.com/DiffSharp/DiffSharp/issues) to discuss the feature before creating a pull request.
 
-1. reference `DiffSharp.Core` and `DiffSharp.Data` in your library code.
+## The Team
 
-2. reference `DiffSharp.Backends.Reference` in your correctness testing code.
+DiffSharp is developed by [Atılım Güneş Baydin](http://www.robots.ox.ac.uk/~gunes/), [Don Syme](https://www.microsoft.com/en-us/research/people/dsyme/) and other contributors, having started as a project supervised by the automatic differentiation wizards [Barak Pearlmutter](https://scholar.google.com/citations?user=AxFrw0sAAAAJ&hl=en) and [Jeffrey Siskind](https://scholar.google.com/citations?user=CgSBtPYAAAAJ&hl=en).
 
-3. reference `DiffSharp.Backends.Torch` and `libtorch-cpu` in your CPU testing code.
+## License
 
-4. reference `DiffSharp.Backends.Torch` and `libtorch-cuda-linux` or `libtorch-cuda-windows` in your (optional) GPU testing code.
+DiffSharp is licensed under the BSD 2-Clause "Simplified" License, which you can find in the [LICENSE](https://github.com/DiffSharp/DiffSharp/blob/dev/LICENSE) file in this repository.
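For reference, the setup instructions removed by this commit can be condensed into a single F# script. This is a sketch only: the package name `DiffSharp-cpu` and the `dsharp.config(backend=Backend.Torch)` call come from the removed lines above, while `dsharp.tensor` and `dsharp.grad` are assumed from the DiffSharp 1.0 API and do not appear in this diff.

```fsharp
// Sketch, not part of the commit: package reference and backend selection
// are from the removed README text; the differentiation calls are assumed.
#r "nuget: DiffSharp-cpu"

open DiffSharp

// Every package except DiffSharp-lite needs the Torch backend selected:
dsharp.config(backend=Backend.Torch)

// Minimal use: gradient of a scalar-valued function at a point
let f (x: Tensor) = x * x + sin x
let g = dsharp.grad f (dsharp.tensor 1.0)
printfn "%A" g
```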