upgrading to 1.3.3 · vyraun/stanford-corenlp-python@57b8346 · GitHub

Commit 57b8346

upgrading to 1.3.3
1 parent cdf70c7 commit 57b8346

1 file changed: +9 −6 lines changed

README.md

Lines changed: 9 additions & 6 deletions
@@ -1,4 +1,4 @@
-# Python interface to Stanford Core NLP tools v1.3.1
+# Python interface to Stanford Core NLP tools v1.3.3
 
 This is a Python wrapper for Stanford University's NLP group's Java-based [CoreNLP tools](http://nlp.stanford.edu/software/corenlp.shtml). It can either be imported as a module or run as a JSON-RPC server. Because it uses many large trained models (requiring 3GB RAM on 64-bit machines and usually a few minutes loading time), most applications will probably want to run it as a server.
 
@@ -10,7 +10,7 @@ This is a Python wrapper for Stanford University's NLP group's Java-based [CoreN
 
 It requires [pexpect](http://www.noah.org/wiki/pexpect) and (optionally) [unidecode](http://pypi.python.org/pypi/Unidecode) to handle non-ASCII text. This script includes and uses code from [jsonrpc](http://www.simple-is-better.org/rpc/) and [python-progressbar](http://code.google.com/p/python-progressbar/).
 
-It runs the Stanford CoreNLP jar in a separate process, communicates with the java process using its command-line interface, and makes assumptions about the output of the parser in order to parse it into a Python dict object and transfer it using JSON. The parser will break if the output changes significantly, but it has been tested on **Core NLP tools version 1.3.1** released 2012-04-09.
+It runs the Stanford CoreNLP jar in a separate process, communicates with the java process using its command-line interface, and makes assumptions about the output of the parser in order to parse it into a Python dict object and transfer it using JSON. The parser will break if the output changes significantly, but it has been tested on **Core NLP tools version 1.3.3** released 2012-07-09.
 
 ## Download and Usage
 
@@ -19,10 +19,10 @@ To use this program you must [download](http://nlp.stanford.edu/software/corenlp
 In other words:
 
     sudo pip install pexpect unidecode   # unidecode is optional
-    git clone git://github.com/dasmith/stanford-corenlp-python.git
-    cd stanford-corenlp-python
-    wget http://nlp.stanford.edu/software/stanford-corenlp-2012-04-09.tgz
-    tar xvfz stanford-corenlp-2012-04-09.tgz
+    git clone git://github.com/dasmith/stanford-corenlp-python.git
+    cd stanford-corenlp-python
+    wget http://nlp.stanford.edu/software/stanford-corenlp-2012-07-09.tgz
+    tar xvfz stanford-corenlp-2012-07-09.tgz
 
 Then, to launch a server:
 

@@ -113,6 +113,7 @@ To use it in a regular script or to edit/debug it (because errors via RPC are op
     corenlp.parse("Parse it")
 
 <!--
+
 ## Adding WordNet
 
 Note: wordnet doesn't seem to be supported using this approach. Looks like you'll need Java.
@@ -122,6 +123,7 @@ tar xvfz WNprolog-3.0.tar.gz
 
 -->
 
+
 ## Questions
 
 **Stanford CoreNLP tools require a large amount of free memory**. Java 5+ uses about 50% more RAM on 64-bit machines than 32-bit machines. 32-bit machine users can lower the memory requirements by changing `-Xmx3g` to `-Xmx2g` or even less.
@@ -131,6 +133,7 @@ If pexpect timesout while loading models, check to make sure you have enough mem
 
 You can reach me, Dustin Smith, by sending a message on GitHub or through email (contact information is available [on my webpage](http://web.media.mit.edu/~dustin)).
 
+
 # Contributors
 
 This is free and open source software and has benefited from the contribution and feedback of others. Like Stanford's CoreNLP tools, it is covered under the [GNU General Public License v2 +](http://www.gnu.org/licenses/gpl-2.0.html), which in short means that modifications to this program must maintain the same free and open source distribution policy.
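The README in this diff describes the wrapper's core trick: parse CoreNLP's command-line output into a Python dict and ship it to clients as JSON over JSON-RPC. As a rough sketch of that round-trip (the function names and the shape of the parsed dict here are hypothetical illustrations, not the wrapper's actual API):

```python
import json

def make_rpc_request(method, params, request_id=1):
    """Build a JSON-RPC 2.0 request string (illustrative only)."""
    return json.dumps({"jsonrpc": "2.0", "method": method,
                       "params": params, "id": request_id})

def make_rpc_response(result, request_id=1):
    """Build the matching JSON-RPC 2.0 response string."""
    return json.dumps({"jsonrpc": "2.0", "result": result, "id": request_id})

# A client would send something like this to the server:
request = make_rpc_request("parse", ["Hello world."])

# The server runs CoreNLP, scrapes its stdout into a dict (shape assumed
# here for illustration), and serializes it back over the wire:
parsed = {"sentences": [{"text": "Hello world.",
                         "words": ["Hello", "world", "."]}]}
response = make_rpc_response(parsed)

# The client decodes the JSON back into ordinary Python objects:
result = json.loads(response)["result"]
print(result["sentences"][0]["words"])  # ['Hello', 'world', '.']
```

Because everything crossing the process boundary is plain JSON, clients in any language can talk to the server; the fragile part, as the README notes, is the server-side scraping of CoreNLP's text output.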
