revised includes to update support for v1.3.3 · orazaro/stanford-corenlp-python@f86fdd9 · GitHub


Commit f86fdd9

revised includes to update support for v1.3.3
1 parent 57b8346 commit f86fdd9

File tree: 2 files changed, +10 -9 lines


README.md

Lines changed: 7 additions & 6 deletions
@@ -44,7 +44,7 @@ Assuming you are running on port 8080, the code in `client.py` shows an example
     result = loads(server.parse("Hello world. It is so beautiful"))
     print "Result", result
 
-That returns a dictionary containing the keys `sentences` and (when applicable) `corefs`. `sentences` are a list of dictionaries for each sentence, which contain `parsetree`, `text`, `tuples` containing the dependencies, and `words`, containing information about parts of speech, NER, etc:
+That returns a dictionary containing the keys `sentences` and (when applicable) `corefs`. The key `sentences` contains a list of dictionaries for each sentence, which contain `parsetree`, `text`, `tuples` containing the dependencies, and `words`, containing information about parts of speech, NER, etc:
 
     {u'sentences': [{u'parsetree': u'(ROOT (S (VP (NP (INTJ (UH Hello)) (NP (NN world)))) (. !)))',
     u'text': u'Hello world!',
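For orientation, here is a minimal sketch of the client-side call the hunk above documents. The `jsonrpc` transport setup follows what the README attributes to `client.py`, and the token/attribute-pair shape of `words` is an assumption drawn from the README's fuller example, not something introduced by this commit:

    # Sketch only: walk the result structure described above (Python 2).
    # The jsonrpc ServerProxy/TransportTcpIp usage is assumed from client.py.
    import jsonrpc
    from simplejson import loads

    server = jsonrpc.ServerProxy(jsonrpc.JsonRpc20(),
                                 jsonrpc.TransportTcpIp(addr=("127.0.0.1", 8080)))

    result = loads(server.parse("Hello world. It is so beautiful"))
    for sentence in result['sentences']:
        print sentence['text']       # raw sentence text
        print sentence['parsetree']  # Penn Treebank-style parse
        for token, attrs in sentence['words']:
            print token, attrs       # attrs: POS tag, NER label, offsets, etc.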
@@ -129,7 +129,7 @@ tar xvfz WNprolog-3.0.tar.gz
 **Stanford CoreNLP tools require a large amount of free memory**. Java 5+ uses about 50% more RAM on 64-bit machines than 32-bit machines. 32-bit machine users can lower the memory requirements by changing `-Xmx3g` to `-Xmx2g` or even less.
 If pexpect timesout while loading models, check to make sure you have enough memory and can run the server alone without your kernel killing the java process:
 
-    java -cp stanford-corenlp-2011-09-16.jar:stanford-corenlp-2011-09-14-models.jar:xom.jar:joda-time.jar -Xmx3g edu.stanford.nlp.pipeline.StanfordCoreNLP -props default.properties
+    java -cp stanford-corenlp-2012-07-09.jar:stanford-corenlp-2012-07-06-models.jar:xom.jar:joda-time.jar -Xmx3g edu.stanford.nlp.pipeline.StanfordCoreNLP -props default.properties
 
 You can reach me, Dustin Smith, by sending a message on GitHub or through email (contact information is available [on my webpage](http://web.media.mit.edu/~dustin)).
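If you prefer to run that memory check from Python rather than the shell, a hedged equivalent is sketched below. It shells out to the same command with the v1.3.3 jar names this hunk switches to, and assumes the jars and `default.properties` sit in the current directory:

    # Sketch: launch the CoreNLP pipeline directly to confirm it fits in memory,
    # using the v1.3.3 jar names introduced by this commit.
    import subprocess

    classpath = ":".join([
        "stanford-corenlp-2012-07-09.jar",
        "stanford-corenlp-2012-07-06-models.jar",
        "xom.jar",
        "joda-time.jar",
    ])

    cmd = ["java", "-cp", classpath, "-Xmx3g",
           "edu.stanford.nlp.pipeline.StanfordCoreNLP",
           "-props", "default.properties"]

    # If this process is killed (OOM) or never finishes loading the models,
    # lower -Xmx3g as the README suggests before blaming pexpect.
    subprocess.call(cmd)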

@@ -138,13 +138,14 @@ You can reach me, Dustin Smith, by sending a message on GitHub or through email
 
 This is free and open source software and has benefited from the contribution and feedback of others. Like Stanford's CoreNLP tools, it is covered under the [GNU General Public License v2 +](http://www.gnu.org/licenses/gpl-2.0.html), which in short means that modifications to this program must maintain the same free and open source distribution policy.
 
-* Justin Cheng jcccf@221513ecf322dc32d6e088fb2f68751e45bac226
-* Abhaya Agarwal 8ed7640388cac8ba6d897739f5c8fe24eb87cc48
+This project has benefited from the contributions of:
 
-## Similar Projects
+* @jcc Justin Cheng
+* Abhaya Agarwal
 
-These two projects are python wrappers for the [Stanford Parser](http://nlp.stanford.edu/software/lex-parser.shtml), different than "core NLP tools":
+## Related Projects
 
+These two projects are python wrappers for the [Stanford Parser](http://nlp.stanford.edu/software/lex-parser.shtml), which includes the Stanford Parser, although the Stanford Parser is another project.
 - [stanford-parser-python](http://projects.csail.mit.edu/spatial/Stanford_Parser) uses [JPype](http://jpype.sourceforge.net/) (interface to JVM)
 - [stanford-parser-jython](http://blog.gnucom.cc/2010/using-the-stanford-parser-with-jython/) uses Python

corenlp.py

Lines changed: 3 additions & 3 deletions
@@ -156,14 +156,14 @@ def __init__(self):
         Checks the location of the jar files.
         Spawns the server as a process.
         """
-        jars = ["stanford-corenlp-2012-04-09.jar",
-                "stanford-corenlp-2012-04-09-models.jar",
+        jars = ["stanford-corenlp-2012-07-09.jar",
+                "stanford-corenlp-2012-07-06-models.jar",
                 "joda-time.jar",
                 "xom.jar"]
 
         # if CoreNLP libraries are in a different directory,
         # change the corenlp_path variable to point to them
-        corenlp_path = "stanford-corenlp-2012-04-09/"
+        corenlp_path = "stanford-corenlp-2012-07-09/"
 
         java_path = "java"
         classname = "edu.stanford.nlp.pipeline.StanfordCoreNLP"
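As a hedged illustration of the jar check this constructor performs, the helper below (hypothetical, not part of corenlp.py) verifies that the renamed v1.3.3 jars exist under `corenlp_path` and assembles the matching classpath string:

    # Hypothetical pre-flight check: confirm the v1.3.3 jars referenced above
    # are present before spawning the java server process.
    import os

    def check_corenlp_jars(corenlp_path="stanford-corenlp-2012-07-09/"):
        jars = ["stanford-corenlp-2012-07-09.jar",
                "stanford-corenlp-2012-07-06-models.jar",
                "joda-time.jar",
                "xom.jar"]
        missing = [j for j in jars
                   if not os.path.exists(os.path.join(corenlp_path, j))]
        if missing:
            raise IOError("Missing CoreNLP jars in %s: %s"
                          % (corenlp_path, ", ".join(missing)))
        # Same jar order corenlp.py uses when it builds the -cp argument.
        return ":".join(os.path.join(corenlp_path, j) for j in jars)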
