- Introduction
- Tables and arrays
- Higher-level operations on strings
- Paths and Directories
- Date and Time
- Data
- Functional Programming
- Additional Libraries
- Technical Choices
This package provides portable functions and variables to manipulate the file system:
- Manipulating filenames: functions for manipulating filenames;
- Directory functions: functions for listing and manipulating directories;
- Directory paths: paths to well-known directories;
- Miscellaneous: uncategorized functions;
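A minimal sketch of the filename-manipulation side of this package (assumes the `paths` rock is installed; the example path is made up):

```lua
-- Portable filename manipulation with the 'paths' package.
local paths = require 'paths'

-- build a path with the platform's separator
local f = paths.concat('data', 'images', 'cat.jpg')

print(paths.dirname(f))   -- directory part: data/images
print(paths.basename(f))  -- file part: cat.jpg

-- directory functions: iterate over entries of the current directory
for entry in paths.files('.') do
  print(entry)
end
```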
- LPeg - Parsing Expression Grammars For Lua
- LPegLabel - Parsing Expression Grammars (with Labels) for Lua
- luv - Low-level interface to libuv
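As a taste of what LPeg provides, here is a small hedged sketch (assumes the `lpeg` rock is installed): a grammar that matches a comma-separated list of integers and captures each one as a number.

```lua
-- LPeg sketch: parse '10,20,30' into the numbers 10, 20, 30.
local lpeg = require 'lpeg'

local digit  = lpeg.R('09')                -- a single decimal digit
local number = lpeg.C(digit^1) / tonumber  -- capture one-or-more digits, convert
local list   = number * (lpeg.P(',') * number)^0

print(list:match('10,20,30'))  -- 10  20  30
```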
Torch is the main package in Torch7, where data structures for multi-dimensional tensors and mathematical operations over them are defined. It also provides utilities for accessing files, serializing objects of arbitrary types, and other useful tasks.
- Tensor Library
- Tensor defines the all-powerful tensor object that provides multi-dimensional numerical arrays with type templating.
- Mathematical operations that are defined for the tensor object types.
- Storage defines a simple storage interface that controls the underlying storage for any tensor object.
- File I/O Interface Library
- File is an abstract interface for common file operations.
- Disk File defines operations on files stored on disk.
- Memory File defines operations on files stored in RAM.
- Pipe File defines operations for using piped commands.
- High-Level File operations defines higher-level serialization functions.
- Useful Utilities
- Timer provides functionality for measuring time.
- Tester is a generic tester framework.
- CmdLine is a command line argument parsing utility.
- Random defines a random number generator package with various distributions.
- Finally, utility functions are provided for easy handling of Torch tensor types and class inheritance.
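A minimal sketch tying the tensor, storage, and serialization pieces together (assumes `torch` is installed; the temporary filename comes from `os.tmpname` and is illustrative only):

```lua
-- Create tensors, do element-wise math, peek at the underlying storage,
-- and round-trip a tensor through the high-level serialization functions.
local torch = require 'torch'

local a = torch.Tensor(2, 3):fill(1)  -- 2x3 tensor of ones
local b = torch.rand(2, 3)            -- uniform random entries
local c = a + b                       -- element-wise addition

print(c:sum())

-- every tensor is a view over a Storage
print(a:storage():size())             -- 6 elements

-- high-level File I/O: serialize and reload an arbitrary object
local fname = os.tmpname()
torch.save(fname, a)
local a2 = torch.load(fname)
os.remove(fname)
print(a2:sum())                       -- same contents as a
```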
This package provides an easy and modular way to build and train simple or complex neural networks using Torch:
- Modules are the bricks used to build neural networks. Each is itself a neural network, but modules can be combined with other networks using containers to create complex neural networks:
- Module: abstract class inherited by all modules;
- Containers: container classes like Sequential, Parallel and Concat;
- Transfer functions: non-linear functions like Tanh and Sigmoid;
- Simple layers: like Linear, Mean, Max and Reshape;
- Table layers: layers for manipulating tables like SplitTable, ConcatTable and JoinTable;
- Convolution layers: Temporal, Spatial and Volumetric convolutions;
- Criterions compute the loss and its gradient for a given input and target, according to a loss function:
- Criterions: a list of all criterions, including Criterion, the abstract class;
- MSECriterion: the Mean Squared Error criterion, used for regression;
- ClassNLLCriterion: the Negative Log-Likelihood criterion, used for classification;
- Additional documentation:
- Overview of the package essentials including modules, containers and training;
- Training: how to train a neural network using StochasticGradient;
- Testing: how to test your modules.
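A hedged sketch of how these pieces fit together (assumes the `nn` package is installed; the layer sizes, target value, and learning rate are arbitrary): a container composes layers into a network, and a criterion drives one hand-written training step.

```lua
-- Build a small network from containers and layers,
-- then run one step of gradient descent by hand.
local nn = require 'nn'

-- a two-layer perceptron: 10 inputs -> 25 hidden units -> 1 output
local mlp = nn.Sequential()            -- container
mlp:add(nn.Linear(10, 25))             -- simple layer
mlp:add(nn.Tanh())                     -- transfer function
mlp:add(nn.Linear(25, 1))

local criterion = nn.MSECriterion()    -- mean squared error, for regression

local input  = torch.rand(10)
local target = torch.Tensor{1}

-- one training step: forward, loss, backward, update
local output = mlp:forward(input)
local loss   = criterion:forward(output, target)
mlp:zeroGradParameters()
mlp:backward(input, criterion:backward(output, target))
mlp:updateParameters(0.01)             -- learning rate
```

The StochasticGradient trainer wraps this loop for you: `nn.StochasticGradient(mlp, criterion):train(dataset)`, where `dataset` supplies a `size()` method and indexed `{input, target}` pairs.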
This package contains several optimization routines and a logger for Torch.
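A minimal sketch of the common calling convention (assumes the `optim` package is installed; the quadratic objective and learning rate are made up for illustration): each routine takes a closure returning the objective value and its gradient, and updates the parameter tensor in place.

```lua
-- Minimize f(x) = x^2 with optim.sgd.
local optim = require 'optim'

local x = torch.Tensor{5}              -- parameter to optimize, starts at 5

-- closure returning f(x) and df/dx at the current parameters
local function feval(x)
  return x[1] * x[1], torch.Tensor{2 * x[1]}
end

local config = {learningRate = 0.1}
for i = 1, 100 do
  optim.sgd(feval, x, config)          -- updates x in place
end
print(x[1])                            -- close to 0
```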
The Autograd package provides automatic differentiation of Torch expressions.
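A hedged sketch of the idea (assumes the torch-autograd package is installed; the function being differentiated is made up): wrapping an ordinary Lua function of tensors yields a new function that returns the gradients alongside the value.

```lua
-- Differentiate a sum-of-squares expression with torch-autograd.
local autograd = require 'autograd'

-- f(params) = sum(x .* x), a scalar Torch expression
local function f(params)
  return torch.sum(torch.cmul(params.x, params.x))
end

local df = autograd(f)  -- df returns (gradients, value)

local params = {x = torch.Tensor{1, 2, 3}}
local grads, value = df(params)
print(value)            -- 14
print(grads.x)          -- 2, 4, 6  (i.e. 2*x)
```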