Careertail
Natural Language Generation in Python
Category: Data Science
Price: Paid
Length: 4 Hours
Language: English
Content type: Video
Level: Advanced
Updated: 19 February 2024
Published: 31 August 2022
Students: 3840
Syllabus
Chapter 1: The order of words in a sentence is important (unless Yoda you are called). That's why in this chapter you'll learn how to represent your data sequentially and use a neural network architecture to model your text data. You'll learn how to create and train a recurrent network to generate new text, character by character. You'll also use the names dataset to build your own baby name generator, using a very simple recurrent neural network and the Keras package.
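To make the idea concrete, here is a minimal sketch of a character-level generator built with a simple recurrent layer in Keras. The tiny names list, the window length, and the training settings are illustrative placeholders, not the course's dataset or configuration.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, SimpleRNN, Dense

# Toy corpus: one name per line; the newline acts as an end-of-name marker.
names = ["anna", "amir", "lena", "liam", "nora", "noah"]
text = "\n".join(names) + "\n"
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for i, c in enumerate(chars)}

# Build (input window, next character) training pairs.
seq_len = 2
X, y = [], []
for i in range(len(text) - seq_len):
    X.append([char_to_idx[c] for c in text[i:i + seq_len]])
    y.append(char_to_idx[text[i + seq_len]])

# One-hot encode: X becomes (samples, seq_len, vocab), y becomes (samples, vocab).
X_onehot = np.eye(len(chars))[np.array(X)]
y_onehot = np.eye(len(chars))[np.array(y)]

# A very simple recurrent model: one SimpleRNN layer, then a softmax over characters.
model = Sequential([
    Input(shape=(seq_len, len(chars))),
    SimpleRNN(32),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X_onehot, y_onehot, epochs=200, verbose=0)

# Generate a new name character by character from a two-letter seed.
generated = "an"
for _ in range(10):
    window = [char_to_idx[c] for c in generated[-seq_len:]]
    x = np.eye(len(chars))[window][np.newaxis]
    next_char = idx_to_char[int(np.argmax(model.predict(x, verbose=0)))]
    if next_char == "\n":          # end-of-name marker reached
        break
    generated += next_char
print(generated)
```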
Chapter 2: In this chapter, you'll find out how to overcome the limitations of recurrent neural networks when input sequences span long intervals. To avoid vanishing and exploding gradient problems, you'll be introduced to long short-term memory (LSTM) networks, which are more effective at handling long-term dependencies. You'll work on a fun project in which you'll build and train a simple LSTM model on selected literary works of Shakespeare to generate new text in his unique writing style.
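As a rough illustration of the step up from a plain RNN, here is a sketch of a character-level LSTM model in Keras. The vocabulary size, context length, and layer sizes are assumptions for illustration; the course's Shakespeare model will differ.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

vocab_size = 60   # assumed number of distinct characters in the corpus
seq_len = 40      # assumed number of context characters per training example

# LSTM gates let the model carry information across long character sequences,
# which softens the vanishing/exploding gradient problems of a simple RNN.
model = Sequential([
    Input(shape=(seq_len,)),
    Embedding(vocab_size, 32),                 # learned character embeddings
    LSTM(128),                                 # gated recurrent layer
    Dense(vocab_size, activation="softmax"),   # distribution over the next character
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.summary()

# Training would look like model.fit(X, y, batch_size=128, epochs=20), where X
# holds character-index windows of shape (samples, seq_len) and y holds the index
# of the character that follows each window.
```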
Chapter 3: In this chapter, you'll learn about the encoder-decoder architecture and how it can be used to model sequence-to-sequence datasets, converting information from one domain to another. You'll use this knowledge to build a model for neural machine translation, training it to translate English sentences into French.
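For a sense of what the encoder-decoder architecture looks like in code, here is a minimal sketch using the Keras functional API. The vocabulary sizes, embedding width, and state dimension are placeholders, not the course's exact configuration.

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

src_vocab, tgt_vocab, latent_dim = 8000, 10000, 256   # assumed sizes

# Encoder: read the English sentence and keep only the final LSTM states,
# which act as a fixed-size summary of the source sentence.
enc_inputs = Input(shape=(None,), name="english_tokens")
enc_emb = Embedding(src_vocab, 128)(enc_inputs)
_, state_h, state_c = LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: generate the French sentence, initialised with the encoder's states.
dec_inputs = Input(shape=(None,), name="french_tokens")
dec_emb = Embedding(tgt_vocab, 128)(dec_inputs)
dec_seq, _, _ = LSTM(latent_dim, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c]
)
dec_outputs = Dense(tgt_vocab, activation="softmax")(dec_seq)

model = Model([enc_inputs, dec_inputs], dec_outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

# Training uses teacher forcing: the decoder input is the target sentence shifted
# right by one position, and the loss compares each prediction with the unshifted target.
```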
Chapter 4: In this chapter, you'll build your very own machine learning seq2seq model. You'll use real-world messages from the Enron email dataset to train an encoder-decoder model, and you'll use it to predict the correct ending for an incomplete input sentence.
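As a rough sketch of how such a model can complete a sentence, the loop below greedily decodes one token at a time from a trained two-input encoder-decoder shaped like the one sketched above. The special start/end token ids, the length cap, and the function name are assumptions for illustration, and re-running the full model at each step trades efficiency for simplicity.

```python
import numpy as np

START_ID, END_ID, MAX_LEN = 1, 2, 20    # assumed special-token ids and length cap

def complete_sentence(model, prompt_ids):
    """Greedily predict an ending for prompt_ids, a list of encoder token ids."""
    enc = np.array([prompt_ids])                       # batch of one source sequence
    decoded = [START_ID]
    for _ in range(MAX_LEN):
        dec = np.array([decoded])                      # decoder tokens generated so far
        probs = model.predict([enc, dec], verbose=0)   # (1, len(decoded), tgt_vocab)
        next_id = int(np.argmax(probs[0, -1]))         # most likely next token
        if next_id == END_ID:                          # stop at the end-of-sentence marker
            break
        decoded.append(next_id)
    return decoded[1:]                                 # drop the start marker

# Usage (token ids would come from the tokenizer fitted on the Enron messages):
# ending_ids = complete_sentence(model, prompt_ids=[45, 12, 7, 903])
```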