# conflict_zone_extractor

A Python package that processes text input describing volunteer activities in conflict zones, such as reports of volunteers physically intervening between Israeli settlers and Palestinian villages. It uses an LLM to extract structured information, such as the location, number of volunteers, actions taken, and outcomes, and ensures the output is consistently formatted and validated through pattern matching for reliability in humanitarian or journalistic contexts.
## Installation

```bash
pip install conflict_zone_extractor
```

## Usage

```python
from conflict_zone_extractor import conflict_zone_extractor

response = conflict_zone_extractor(
    user_input="Volunteers intervened in the West Bank today, with 15 people present.",
    api_key="your_api_key"  # Optional; if not provided, the package will use the default ChatLLM7
)
print(response)
```

## Using other LLMs

You can use any LLM compatible with LangChain. Here are examples with different LLMs:

### OpenAI

```python
from langchain_openai import ChatOpenAI
from conflict_zone_extractor import conflict_zone_extractor

llm = ChatOpenAI()
response = conflict_zone_extractor(
    user_input="Volunteers intervened in the West Bank today, with 15 people present.",
    llm=llm
)
print(response)
```

### Anthropic

```python
from langchain_anthropic import ChatAnthropic
from conflict_zone_extractor import conflict_zone_extractor

llm = ChatAnthropic()
response = conflict_zone_extractor(
    user_input="Volunteers intervened in the West Bank today, with 15 people present.",
    llm=llm
)
print(response)
```

### Google

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from conflict_zone_extractor import conflict_zone_extractor

llm = ChatGoogleGenerativeAI()
response = conflict_zone_extractor(
    user_input="Volunteers intervened in the West Bank today, with 15 people present.",
    llm=llm
)
print(response)
```

## Parameters

- `user_input` (str): The user input text to process.
- `llm` (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key` (Optional[str]): The API key for LLM7. If not provided, the package will use the default LLM7 setup or the API key from the environment variable `LLM7_API_KEY`.
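The description above mentions that output is validated through pattern matching. The package's internal validation is not shown here, but as an illustration, that kind of check could look like the following sketch. The field names, the `location: ...` line format, and the `validate_response` helper are all assumptions for illustration, not the package's actual schema or API:

```python
import re

# Hypothetical validator: confirms an LLM response contains the structured
# fields the package promises (location, volunteer count, actions, outcome).
# Field names and line formats here are illustrative assumptions.
FIELD_PATTERNS = {
    "location": re.compile(r"^location:\s*(.+)$", re.MULTILINE),
    "volunteers": re.compile(r"^volunteers:\s*(\d+)$", re.MULTILINE),
    "actions": re.compile(r"^actions:\s*(.+)$", re.MULTILINE),
    "outcome": re.compile(r"^outcome:\s*(.+)$", re.MULTILINE),
}

def validate_response(text: str) -> dict:
    """Return the extracted fields, or raise ValueError if any are missing."""
    extracted = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = pattern.search(text)
        if match is None:
            raise ValueError(f"missing field: {name}")
        extracted[name] = match.group(1).strip()
    return extracted

sample = (
    "location: West Bank\n"
    "volunteers: 15\n"
    "actions: physical intervention\n"
    "outcome: de-escalated\n"
)
print(validate_response(sample))
```

A validator like this lets the caller fail fast on malformed LLM output instead of passing incomplete records downstream.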
## Default LLM

By default, the package uses `ChatLLM7` from `langchain_llm7`. You can safely pass your own LLM instance if you want to use another LLM.

The default rate limits of the LLM7 free tier are sufficient for most use cases of this package. If you need higher rate limits, pass your own API key via the environment variable `LLM7_API_KEY` or directly via the `api_key` parameter. You can get a free API key by registering at LLM7.
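The documented precedence (an explicit `api_key` argument first, then the `LLM7_API_KEY` environment variable, then the free-tier default) can be sketched as below. The `resolve_api_key` helper is illustrative only and not part of the package:

```python
import os

def resolve_api_key(api_key=None):
    """Illustrative key lookup: explicit argument wins, then the
    LLM7_API_KEY environment variable, then None (free-tier default)."""
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "env-key"
print(resolve_api_key("explicit-key"))  # explicit argument takes precedence
print(resolve_api_key())                # falls back to the environment variable
```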
## Issues

If you encounter any issues, please report them on the GitHub issues page.
## Author

- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell