Welcome, fellow creators and explorers of the digital frontier! 🌟
Today, we’re diving deep into the exciting realm of artificial intelligence. Our destination: crafting an attention-grabbing Twitter thread infused with the power of LangChain and the mighty GPT-3.5 Turbo.
In this comprehensive guide, we’ll walk you through each stage of the process. From installation to weaving captivating content, you’ll gain the skills and insights needed to craft compelling narratives that resonate with your audience.
Let’s embark on this journey together and unlock the potential of AI-powered storytelling!
We will be using Python (via Google Colab) for this.
For APIs, you will need an OpenAI API key and a Serper API key.
Step 1: Setting Up the Stage – Prerequisites and Installation
Before the spotlight shines on our AI-powered Twitter thread, let’s ensure you’re ready for the show:
- Access Google Colab: If you haven’t yet, navigate to Google Colab.
- Sign In: Log in using your Google account credentials.
- Create a New Notebook: Click “New Notebook” to create a fresh canvas for our masterpiece.
Now that you’re in the spotlight of Colab, let’s gather the tools we need. In a code cell, enter these commands:
!pip install openai langchain playwright beautifulsoup4 unstructured[local-inference] requests
These commands lay the foundation for our AI-driven creation.
Then import the os module and set the OpenAI API key as an environment variable:
import os
os.environ["OPENAI_API_KEY"] = "YOUR API KEY"
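The search function in the next step also needs a Serper API key. One way to keep both keys out of the notebook body is a small helper that reads them from the environment and fails fast if they are missing. This is just a sketch; require_env is our own hypothetical helper, not part of LangChain or OpenAI:

```python
import os

def require_env(name):
    """Return the value of an environment variable, or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Set the {name} environment variable before running this notebook")
    return value

# Example usage (assumes you have already set these in the environment):
# openai_key = require_env("OPENAI_API_KEY")
# serper_key = require_env("SERPER_API_KEY")
```

This way the notebook can be shared without an API key ever appearing in a cell.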
Step 2: Preparing the Script – Google Search with AI
Our creative journey kicks off with using AI to explore the vast digital landscape of Google. Introducing the protagonist of this step: the search(query) function.
import json
import requests

def search(query):
    # API endpoint for Google search via Serper
    url = "https://google.serper.dev/search"

    # Preparing the request payload
    payload = json.dumps({
        "q": query
    })
    headers = {
        'X-API-KEY': 'YOUR_API_KEY',
        'Content-Type': 'application/json'
    }

    # Sending the POST request and capturing the response
    response = requests.request("POST", url, headers=headers, data=payload)
    response_data = response.json()
    print("search results:", response_data)
    return response_data
Explanation:
- We import json to handle JSON data and requests for making HTTP requests.
- The search function takes a query as input, defining the topic you’re researching.
- We define the API endpoint of the Serper API, which acts as our gateway to Google search.
- A JSON payload is prepared with your query.
- We set headers, including the API key for authentication.
- Using the requests library, we send a POST request with the payload and headers.
- The response data, received as JSON, is printed and returned.
Example Usage: If you’re inquiring about “AI in education,” invoke the function like this:
search_result = search("AI in education")
Stay with us as we reveal the next steps in this AI-powered expedition!
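Serper typically returns its web results under an "organic" key, where each item carries a "link" field. To make the response shape concrete, here is a small sketch that pulls the top result URLs from a response of that form (the sample data below is made up for illustration):

```python
def top_links(response_data, limit=3):
    """Pull the first few result URLs from a Serper-style response dict."""
    organic = response_data.get("organic", [])
    return [item["link"] for item in organic[:limit] if "link" in item]

# Hypothetical, hand-written sample mimicking Serper's response shape
sample = {
    "organic": [
        {"title": "A", "link": "https://example.com/a"},
        {"title": "B", "link": "https://example.com/b"},
        {"title": "C", "link": "https://example.com/c"},
        {"title": "D", "link": "https://example.com/d"},
    ]
}
print(top_links(sample))  # the first three links
```

In the next step we will let the language model do this picking for us, with relevance judgment a plain slice cannot provide.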
Step 3: Unearthing Hidden Treasures – Extracting Best Article URLs
Having set sail and cast our AI-powered net, it’s time to uncover the gems within the search results. Behold, the function find_best_article_urls(response_data, query) takes center stage:
from langchain import LLMChain, PromptTemplate
from langchain.chat_models import ChatOpenAI

def find_best_article_urls(response_data, query):
    # Convert the response data to a string for the prompt
    response_str = json.dumps(response_data)

    llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7)
    template = """
    Imagine you're a world-class journalist and researcher. Your expertise lies in finding the most relevant articles on specific topics. Here are the search results for the query "{query}":
    {response_str}
    Your task: Select the top 3 articles from the list and provide ONLY an array of URLs. Exclude any additional content.
    """
    prompt_template = PromptTemplate(
        input_variables=["response_str", "query"], template=template
    )
    article_picker_chain = LLMChain(
        llm=llm, prompt=prompt_template, verbose=True
    )

    urls = article_picker_chain.predict(response_str=response_str, query=query)
    url_list = json.loads(urls)
    print(url_list)
    return url_list
Explanation:
- Import the necessary modules: LLMChain and PromptTemplate from langchain, and ChatOpenAI from langchain.chat_models.
- Our function, find_best_article_urls, takes the response data and query as inputs.
- We convert the response data to a string format for LangChain compatibility.
- Set up the LangChain model, utilizing the mighty GPT-3.5 Turbo.
- Craft a dynamic template guiding the AI through the selection process.
- Instantiate the prompt template using PromptTemplate.
- Create a LangChain instance for this task, named article_picker_chain.
- Utilize LangChain to predict and extract the best article URLs from the response data.
Example Usage: Upon obtaining the search results, you can invoke this function as follows:
best_urls = find_best_article_urls(search_result, "AI in education")
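One caveat: json.loads assumes the model returns a clean JSON array, but in practice GPT-3.5 sometimes wraps the array in prose. A defensive parser is cheap insurance. This is a sketch of our own (parse_url_list is a hypothetical helper, not a LangChain function); it falls back to a regex scrape when strict JSON parsing fails:

```python
import json
import re

def parse_url_list(raw):
    """Parse an LLM reply into a list of URLs, tolerating extra prose."""
    try:
        parsed = json.loads(raw)
        if isinstance(parsed, list):
            return parsed
    except json.JSONDecodeError:
        pass
    # Fallback: scrape anything that looks like a URL out of the text
    return re.findall(r"https?://[^\s\"',\]]+", raw)

print(parse_url_list('["https://a.com", "https://b.com"]'))
print(parse_url_list("Sure! Here you go: https://a.com and https://b.com"))
```

You could swap this in for the bare json.loads call inside find_best_article_urls if you find the chain failing intermittently.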
Step 4: Unveiling Captivating Insights – Content Extraction
With URLs in hand, we embark on an expedition to extract captivating content. Enter the get_content_from_urls(urls) function:
from langchain.document_loaders import UnstructuredURLLoader

def get_content_from_urls(urls):
    loader = UnstructuredURLLoader(urls=urls)
    data = loader.load()
    print(data)
    return data
Explanation:
- Import the UnstructuredURLLoader from langchain.document_loaders.
- The function get_content_from_urls takes the list of URLs as input.
- We create a document loader instance using the provided URLs.
- The loader.load() method extracts and aggregates content from the URLs.
Example Usage: You can utilize this function to extract content from the obtained URLs:
extracted_content = get_content_from_urls(best_urls)
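What comes back from loader.load() is a list of Document objects, each carrying the page text in page_content plus some metadata. A quick size check before summarizing can tell you how much text you are about to push through the model. The sketch below uses a minimal stand-in Document class purely for illustration; LangChain ships its own:

```python
from dataclasses import dataclass, field

# Minimal stand-in for LangChain's Document class, for illustration only
@dataclass
class Document:
    page_content: str
    metadata: dict = field(default_factory=dict)

def total_characters(docs):
    """Rough size check before summarization: how much text did we pull down?"""
    return sum(len(doc.page_content) for doc in docs)

docs = [Document("First article text."), Document("Second article text.")]
print(total_characters(docs))
```

If the total is small, you may not need the chunking machinery of the next step at all.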
Step 5: Crafting a Compelling Narrative – Summarization with LangChain
As we voyage deeper, we must master the art of summarization. Behold, the function summary(data, query) comes into play:
from langchain.text_splitter import CharacterTextSplitter
from langchain.chat_models import ChatOpenAI
from langchain.chains.llm import LLMChain
from langchain.prompts import PromptTemplate

def summary(data, query):
    text_splitter = CharacterTextSplitter(
        separator="\n\n",
        chunk_size=3000,
        chunk_overlap=200,
        length_function=len,
    )
    text = text_splitter.split_documents(data)

    llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7)
    template = """
    {text}
    Imagine you're a world-class journalist tasked with summarizing the text above to create a captivating Twitter thread about "{query}". Adhere to these guidelines:
    1/ Make the content engaging, informative, and backed by solid data.
    2/ Limit the thread to 3-5 tweets.
    3/ Address the core topic of "{query}" comprehensively.
    4/ Aim for virality, targeting at least 1660 likes.
    5/ Ensure readability and clarity.
    6/ Provide actionable insights.
    SUMMARY:
    """
    prompt_template = PromptTemplate(
        input_variables=["text", "query"], template=template
    )
    summariser_chain = LLMChain(
        llm=llm, prompt=prompt_template, verbose=True
    )

    summaries = []
    for chunk in text:
        # Use a distinct name so we don't shadow the summary function itself
        chunk_summary = summariser_chain.predict(text=chunk, query=query)
        summaries.append(chunk_summary)

    print(summaries)
    return summaries
Explanation:
- Import the necessary modules and classes: CharacterTextSplitter, ChatOpenAI, LLMChain, and PromptTemplate from langchain.
- The function summary accepts the extracted content data and query as inputs.
- We split the content into manageable chunks using CharacterTextSplitter.
- Set up LangChain using GPT-3.5 Turbo and create a dynamic template for summarization.
- Create an instance of PromptTemplate.
- Instantiate the LangChain summarization model as summariser_chain.
- Loop through the content chunks, utilizing LangChain to generate summaries.
Example Usage: Once you have the extracted content, you can invoke this function as follows:
thread_summaries = summary(extracted_content, "AI in education")
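Stripped to its essentials, CharacterTextSplitter's job is to slide a fixed-size window over the text with some overlap, so no passage is cut off from its surrounding context. Here is a plain-Python sketch of that idea; this is our own simplification, not LangChain's actual implementation (which also respects the separator):

```python
def split_with_overlap(text, chunk_size=3000, chunk_overlap=200):
    """Split text into chunks of at most chunk_size characters,
    where each chunk repeats the last chunk_overlap characters of the previous one."""
    if chunk_size <= chunk_overlap:
        raise ValueError("chunk_size must exceed chunk_overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - chunk_overlap
    return chunks

# Small values make the overlap easy to see:
print(split_with_overlap("abcdefghij", chunk_size=4, chunk_overlap=2))
# -> ['abcd', 'cdef', 'efgh', 'ghij', 'ij']
```

The overlap is why chunk boundaries rarely hurt summary quality: the model sees each boundary region twice.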
Step 6: Weaving the Threads of Wisdom – Crafting a Twitter Thread
With summaries in hand, it’s time to craft a captivating Twitter thread. Enter the function write_twitter_thread(summaries, query):
def write_twitter_thread(summaries, query):
    summaries_str = str(summaries)

    llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7)
    template = """
    {summaries_str}
    Imagine you're a world-class journalist and a Twitter influencer. You have the context above about "{query}". Craft a viral Twitter thread about "{query}" while adhering to these rules:
    1/ Make the thread engaging, data-rich, and informative.
    2/ Keep the thread concise, within 3-5 tweets.
    3/ Thoroughly cover the topic of "{query}".
    4/ Strive for virality with at least 1000 likes.
    5/ Write in an accessible and understandable manner.
    6/ Offer actionable insights to the audience.
    TWITTER THREAD:
    """
    prompt_template = PromptTemplate(
        input_variables=["summaries_str", "query"], template=template
    )
    twitter_thread_chain = LLMChain(
        llm=llm, prompt=prompt_template, verbose=True
    )

    twitter_thread = twitter_thread_chain.predict(summaries_str=summaries_str, query=query)
    return twitter_thread
Explanation:
- The function write_twitter_thread takes summaries and the query as inputs.
- We set up LangChain with GPT-3.5 Turbo and create a dynamic template for crafting the Twitter thread.
- Create an instance of PromptTemplate, declaring both summaries_str and query as input variables.
- Create a LangChain model for crafting the Twitter thread as twitter_thread_chain.
- Utilize LangChain to generate a Twitter thread based on the provided summaries.
Example Usage: To craft a Twitter thread based on the generated summaries, use this function:
crafted_thread = write_twitter_thread(thread_summaries, "AI in education")
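Before posting, it is worth verifying that each tweet in the generated thread actually fits Twitter's 280-character limit, since the model only aims for it. A small sketch of such a check (over_limit_tweets is our own helper, and splitting on blank lines is an assumption about how the model formats its output):

```python
def over_limit_tweets(thread, limit=280):
    """Return the tweets (split on blank lines) that exceed the character limit."""
    tweets = [t.strip() for t in thread.split("\n\n") if t.strip()]
    return [t for t in tweets if len(t) > limit]

thread = "1/ Short tweet.\n\n2/ Another short tweet."
print(over_limit_tweets(thread))  # -> [] when every tweet fits
```

If the list is non-empty, you can simply re-run write_twitter_thread or ask the model to shorten the offending tweets.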
You’ve traversed the full journey of extracting insights and crafting engaging narratives using AI!
Conclusion: Setting Sail for AI-Powered Creativity
Congratulations, intrepid explorers! 🚀 You’ve successfully embarked on a voyage that has led you through the intricate waters of AI-powered content creation. From utilizing AI for Google searches to crafting captivating Twitter threads, you’ve harnessed the power of LangChain and GPT-3.5 Turbo to create compelling narratives. Armed with these skills, you’re now ready to infuse your projects and content with a touch of AI magic. As you continue on your journey, remember that the seas of innovation are boundless, and the possibilities are limitless. Set sail and create with the winds of AI-powered creativity at your back!
Stay curious, and may your AI-powered endeavors lead you to new horizons. Until next time, happy coding and creating! 🌟🛠️