Dominic Nidy

domnidy

AI & ML interests

None yet

Recent Activity

reacted to Kseniase's post with 👍 8 days ago
9 types of "Chain-of-..." approaches:

Chain-of-Thought (CoT) prompting enhances reasoning in AI models by breaking down complex problems into step-by-step logical sequences. It continues to prove its effectiveness, especially in top-performing reasoning models. However, there are other similar methods that expand CoT and can be used for different purposes. Here are 9 of them (a hedged prompt sketch follows the list):

1. Chain-of-Action-Thought (COAT) -> https://huggingface.co/papers/2502.02508
   Helps the model decide when to keep thinking, double-check its work, or try a different approach, using special guiding tokens.
2. Chain of Draft (CoD) -> https://huggingface.co/papers/2502.18600
   Helps the model generate short but meaningful reasoning steps, cutting costs and speeding up processing.
3. Chain-of-Agents -> https://huggingface.co/papers/2406.02818
   Uses multi-agent collaboration: worker agents process parts of the text in a structured chain, and a manager agent summarizes the results.
4. Chain-of-RAG -> https://huggingface.co/papers/2501.14342
   Builds retrieval chains instead of retrieving all information at once, and can dynamically adjust its search process and parameters such as the number of steps.
5. Chain-of-Shot Prompting (CoS) -> https://huggingface.co/papers/2502.06428
   Helps models pick the frames crucial for understanding a video, using a binary video summary and a video co-reasoning module.
6. Chain of Hindsight (CoH) -> https://huggingface.co/papers/2302.02676
   Converts all feedback into sequences to fine-tune the model and refine outputs.
7. Chain-of-Note (CoN) -> https://huggingface.co/papers/2311.09210
   Generates sequential reading notes for each retrieved document to assess relevance before integrating the information into the final answer.
8. Chain of Diagnosis (CoD) -> https://huggingface.co/papers/2407.13301
   Transforms the diagnostic process into a diagnostic chain.
9. Chain(s)-of-Knowledge -> https://www.turingpost.com/p/cok
   Enhances LLMs by dynamically pulling in external knowledge to improve accuracy and reduce errors.
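To make the prompting idea concrete, here is a minimal sketch contrasting a standard Chain-of-Thought prompt with a Chain-of-Draft-style prompt. It assumes the OpenAI Python client; the model name, example question, and the exact prompt wording are placeholders chosen for illustration, not taken from the papers above.

```python
# Minimal sketch: Chain-of-Thought vs. Chain-of-Draft style prompting.
# Assumes the OpenAI Python client (pip install openai) and an OPENAI_API_KEY
# in the environment; the model name and prompt wording are assumptions.
from openai import OpenAI

client = OpenAI()

QUESTION = "A train travels 120 km in 1.5 hours. What is its average speed in km/h?"

# Standard Chain-of-Thought: ask for explicit step-by-step reasoning.
cot_prompt = (
    "Think step by step and show your reasoning before giving the final answer.\n\n"
    f"Question: {QUESTION}"
)

# Chain-of-Draft style: ask for terse intermediate drafts to cut token cost.
cod_prompt = (
    "Reason in short drafts of at most five words per step, "
    "then give the final answer on the last line.\n\n"
    f"Question: {QUESTION}"
)

for name, prompt in [("CoT", cot_prompt), ("CoD", cod_prompt)]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; swap in any chat model
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {name} ---")
    print(response.choices[0].message.content)
```

The only difference between the two runs is the instruction wrapping the question: CoT asks for full intermediate reasoning, while the CoD-style variant constrains each step to a short draft to reduce output tokens.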
reacted to singhsidhukuldeep's post with 👍 8 days ago
Exciting New Tool for Knowledge Graph Extraction from Plain Text!

I just came across a groundbreaking new tool called KGGen that solves a major challenge in the AI world: the scarcity of high-quality knowledge graph data.

KGGen is an open-source Python package that leverages language models to extract knowledge graphs (KGs) from plain text. What makes it special is its innovative approach to clustering related entities, which significantly reduces sparsity in the extracted KGs.

The technical approach is fascinating (a rough sketch of the idea follows this post):
1. KGGen uses a multi-stage process involving an LLM (GPT-4o in their implementation) to extract entities and relations from source text
2. It aggregates graphs across sources to reduce redundancy
3. Most importantly, it applies iterative LM-based clustering to refine the raw graph

The clustering stage is particularly innovative: it identifies which nodes and edges refer to the same underlying entities or concepts, normalizing variations in tense, plurality, stemming, and capitalization (e.g., "labors" clustered with "labor").

The researchers from Stanford and the University of Toronto also introduced MINE (Measure of Information in Nodes and Edges), the first benchmark for evaluating KG extractors. When tested against existing methods such as OpenIE and GraphRAG, KGGen outperformed them by up to 18%.

For anyone working with knowledge graphs, RAG systems, or KG embeddings, this tool addresses the fundamental challenge of data scarcity that has been holding back progress in graph-based foundation models.

The package is available via pip install kg-gen, making it accessible to everyone. This could be a game-changer for knowledge graph applications!
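The sketch below illustrates the extract-aggregate-cluster idea described in the post. It is not the kg-gen API: the function names (extract_triples, normalize, build_graph), the prompt wording, the JSON schema, and the crude string normalization are all assumptions standing in for the package's LLM-based extraction and iterative LM-based clustering.

```python
# Rough sketch of the extract-then-cluster idea behind KGGen; NOT the kg-gen API.
# Assumes the OpenAI Python client and that the model returns plain JSON.
import json
from collections import defaultdict
from openai import OpenAI

client = OpenAI()

def extract_triples(text: str) -> list[tuple[str, str, str]]:
    """Ask an LLM to extract (subject, relation, object) triples from text."""
    response = client.chat.completions.create(
        model="gpt-4o",  # the paper's implementation also uses GPT-4o
        messages=[{
            "role": "user",
            "content": (
                "Extract knowledge-graph triples from the text below. "
                "Return only a JSON list of [subject, relation, object] lists.\n\n"
                + text
            ),
        }],
    )
    return [tuple(t) for t in json.loads(response.choices[0].message.content)]

def normalize(entity: str) -> str:
    """Crude stand-in for LM-based clustering: fold case and trailing plurals."""
    e = entity.strip().lower()
    return e[:-1] if e.endswith("s") else e

def build_graph(chunks: list[str]) -> dict[str, set[tuple[str, str]]]:
    """Aggregate triples across source chunks, merging entities that normalize alike."""
    graph: dict[str, set[tuple[str, str]]] = defaultdict(set)
    for chunk in chunks:
        for subj, rel, obj in extract_triples(chunk):
            graph[normalize(subj)].add((rel, normalize(obj)))
    return graph
```

In KGGen itself the normalization step is replaced by iterative LM-based clustering, which is what lets it merge entities like "labors" and "labor" and keeps the resulting graph dense rather than sparse.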
liked a model 8 days ago
qihoo360/TinyR1-32B-Preview

Organizations

None yet