Study Guide: CSCI GA-2590
by Abhishek Notetaker
About this Document

Study Guide: NLP Exam Study Guide
Course: Natural Language Processing, taught by Dr. Ralph Grishman

This 3-page Study Guide was uploaded by Abhishek Notetaker on Sunday, March 6, 2016. The Study Guide belongs to CSCI GA-2590 at NYU, taught by Dr. Ralph Grishman in Fall 2016. Since its upload, it has received 26 views. For similar materials see Natural Language Processing in Computer Science at NYU.


Date Created: 03/06/16
The final exam

• Worth 30 points towards the final grade
• Given Friday, May 15, 2016, 5:10 -- 7:00
• Closed book, but:
  • You may bring one or two sheets of notes, double-sided, with your name on each sheet. (This may be convenient for definitions or formulas, for example.)
  • You may bring a simple calculator, which may be helpful for questions on HMM or PCFG probabilities or word similarities. No other electronic equipment is permitted.
• Approximately 8 to 10 questions

Most questions will be of one of the following types. Many of these correspond directly to questions asked for homework. In addition, there may be a few short-answer questions corresponding to major points of a lecture; some of these are included in the list below. I may also ask a short (few-sentence) essay question about an issue we have discussed in the lectures.

1. Bag-of-words methods -- information retrieval: Compute the cosine similarity between two documents, with and without TF-IDF weighting (lecture #2 and homework #1).

2. Bag-of-words methods -- sentiment analysis: Given a very small corpus with binary labels (good / bad), use Naive Bayes to label additional documents. Explain through an example the need for (Laplace) smoothing (lecture #2 and homework #1).

3. English sentence structure and context-free grammar: Label the constituents (NP, VP, PP, etc.) of an English sentence based on the grammar given in Chapter #12 (and summarized in the handout for homework #2). If the sentence is ambiguous, show its multiple parses. If the sentence violates some grammatical constraint, describe the constraint. Extend the context-free grammar to cover an additional construct, or to capture a grammatical constraint (lecture #3, homework #2).

4. Parsing: Given a very small context-free grammar, step through the operation of a top-down backtracking parser or a bottom-up (CKY) parser. What is the [time] complexity of these parsers? Convert the constituent structure into a dependency structure (lecture #3).

5. POS tagging: Tag a sentence using the Penn POS tags (lecture #4).

6. HMMs and the Viterbi decoder: Describe how POS tagging can be performed using a probabilistic model (J&M sec. 5.5 and chap. 6; lecture #4). Create an HMM from some POS-tagged training data. Trace the operation of a Viterbi decoder. Compute the likelihood of a given tag sequence and the likelihood of generating a given sentence from an HMM. What is the [time] complexity of the decoder with respect to sentence length and number of POS? (lecture #4 and homework #3)

7. Chunkers and name taggers: Explain how BIO tags can be used to reduce chunking or name identification to a token-tagging task. Explain how chunking can be evaluated (compute recall, precision, and F-measure for an example) (lecture #5).

8. Maximum entropy: Explain how a maximum-entropy model can be used for tagging or chunking (lecture #6 and homework #6). Suggest some suitable features for each task.

9. Jet: Be able to extend, or trace the operation of, one of the Jet pattern sets we have distributed and discussed (for noun and verb groups, and for appointment events). Analyze and correct a shortcoming in the appointment patterns (homework #7).

10. Lexical semantics and word sense disambiguation: Given two words, state their semantic relationship; given a word with two senses and a small training set of contexts for each of the two senses, apply the naive Bayes procedure to resolve the sense of the word in a test case (J&M 20.2.2); given two words and a few sentences containing them, compute their cosine similarity (lecture #8).

11. Learning: Be able to briefly define supervised, semi-supervised, unsupervised, and active learning procedures (lecture #8).

12. Reference resolution: Analyze a reference resolution problem (within a document or cross-document) -- identify the type of anaphora and the constraints and preferences which would lead a system to select the correct antecedent (lecture #10/11).

13. Probabilistic CFG: Train a probabilistic CFG from some parses; apply this PCFG to disambiguate a sentence. Explain how this PCFG can be extended to capture lexical information. Compute lexically-conditioned probabilities (lecture #11/12; homework #8).

14. Machine translation: Give the basic formula for noisy channel translation. Explain how an n-gram language model can be computed. What assumption is made by IBM Model 1 (lecture #13)?
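The cosine computation asked for in question type 1 can be sketched in a few lines of Python. This is a minimal illustration, not course code; the function names and the toy corpus are my own:

```python
import math
from collections import Counter

def cosine(vec1, vec2):
    """Cosine similarity between two term-weight vectors (dicts of term -> weight)."""
    dot = sum(w * vec2.get(t, 0.0) for t, w in vec1.items())
    n1 = math.sqrt(sum(w * w for w in vec1.values()))
    n2 = math.sqrt(sum(w * w for w in vec2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def tf_idf(doc, corpus):
    """TF-IDF weights for one tokenized document, given the corpus it came from.
    TF is the raw term count; IDF is log(N / document frequency)."""
    n = len(corpus)
    tf = Counter(doc)
    return {t: tf[t] * math.log(n / sum(1 for d in corpus if t in d)) for t in tf}
```

With raw term counts, `cosine(Counter("a a b".split()), Counter("a b b".split()))` is 0.8; with TF-IDF weighting, a term that appears in every document gets IDF log(1) = 0, so words shared by all documents stop inflating the similarity.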
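Tracing a Viterbi decoder by hand (question type 6) follows the same recurrence as this small sketch. The two-tag toy HMM in the usage note is my own illustration, not data from the lectures:

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    """Most likely tag sequence for `words` under a simple HMM.

    start_p[t]     -- P(t starts the sentence)
    trans_p[p][t]  -- P(t | previous tag p)
    emit_p[t][w]   -- P(word w | tag t)
    """
    # V[i][t] = probability of the best tag path ending in tag t at word i
    V = [{t: start_p[t] * emit_p[t].get(words[0], 0.0) for t in tags}]
    back = [{}]
    for i in range(1, len(words)):
        V.append({})
        back.append({})
        for t in tags:
            prev = max(tags, key=lambda p: V[i - 1][p] * trans_p[p][t])
            V[i][t] = V[i - 1][prev] * trans_p[prev][t] * emit_p[t].get(words[i], 0.0)
            back[i][t] = prev
    # Follow the backpointers from the best final tag
    path = [max(tags, key=lambda t: V[-1][t])]
    for i in range(len(words) - 1, 0, -1):
        path.insert(0, back[i][path[0]])
    return path
```

Each cell V[i][t] is filled by a max over all previous tags, so the decoder runs in O(n * |T|^2) time for a sentence of n words and |T| tags, which is the complexity answer the question asks about.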
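Training a PCFG from parses (question type 13) is just relative-frequency estimation over rule counts, and a tree's probability is the product of the probabilities of the rules it uses. A sketch under my own conventions (the tuple tree encoding and function names are assumptions, not from the course):

```python
from collections import Counter

def train_pcfg(rules):
    """Relative-frequency estimates P(lhs -> rhs) from a list of observed
    (lhs, rhs) rule applications collected from training parses."""
    rule_count = Counter(rules)
    lhs_count = Counter(lhs for lhs, _ in rules)
    return {rule: count / lhs_count[rule[0]] for rule, count in rule_count.items()}

def tree_prob(tree, probs):
    """Probability of a parse tree: the product of its rule probabilities.
    A tree is a tuple (label, child, ...); leaves (words) are plain strings."""
    label, *children = tree
    if all(isinstance(c, str) for c in children):
        rhs = tuple(children)                 # preterminal rule, e.g. N -> fish
    else:
        rhs = tuple(c[0] for c in children)   # child labels form the rhs
    p = probs[(label, rhs)]
    for c in children:
        if not isinstance(c, str):
            p *= tree_prob(c, probs)
    return p
```

For example, if NP rewrote as N in 3 of 4 observed NP expansions, train_pcfg gives P(NP -> N) = 0.75, and that factor multiplies into the probability of every tree using the rule; choosing between two parses of an ambiguous sentence then means comparing their tree_prob values.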

