# ScrapeGraph SDKs

Official SDKs for interacting with the ScrapeGraph AI API - a powerful web scraping and data extraction service.

## Available SDKs

- [Python SDK (scrapegraph-py)](#python-sdk)
- [JavaScript SDK (scrapegraph-js)](#javascript-sdk)

## Python SDK

### Installation

```bash
pip install scrapegraph-py
```

### Features
| 16 | + |
| 17 | +- Web Scraping (basic and structured) |
| 18 | +- Credits checking |
| 19 | +- Feedback submission |
| 20 | +- API status checking |
| 21 | +- Local HTML scraping support |
| 22 | +- Pydantic schema integration |
| 23 | + |
### Basic Usage

```python
from scrapegraph_py import ScrapeGraphClient, smart_scraper
from dotenv import load_dotenv
import os

load_dotenv()
api_key = os.getenv("SCRAPEGRAPH_API_KEY")
client = ScrapeGraphClient(api_key)

url = "https://example.com"
prompt = "What does the company do?"

result = smart_scraper(client, url, prompt)
print(result)
```
### Structured Data with Schema

```python
from pydantic import BaseModel, Field

class CompanyInfoSchema(BaseModel):
    company_name: str = Field(description="The name of the company")
    description: str = Field(description="A description of the company")
    main_products: list[str] = Field(description="The main products of the company")

result = smart_scraper(
    client=client,
    url="https://example.com",
    prompt="Extract company information",
    schema=CompanyInfoSchema
)
```
## JavaScript SDK

### Installation

```bash
npm install scrapegraph-js
```

### Features
| 52 | + |
| 53 | +- Smart web scraping |
| 54 | +- Credits management |
| 55 | +- Feedback submission |
| 56 | +- Schema-based extraction |
| 57 | +- Promise-based API |
| 58 | + |
### Basic Usage

```javascript
import { smartScraper, credits, feedback } from 'scrapegraph-js';

const apiKey = process.env.SCRAPEGRAPH_API_KEY;
const url = 'https://example.com';

// Basic scraping
const result = await smartScraper(apiKey, url, "What does the company do?");
console.log(JSON.parse(result));

// Check credits
const creditsInfo = await credits(apiKey);
console.log(JSON.parse(creditsInfo));
```
### Schema-based Extraction

```javascript
const schema = {
  title: "CompanyInfo",
  properties: {
    company_name: { type: "string", description: "The name of the company" },
    description: { type: "string", description: "A description of the company" },
    main_products: {
      type: "array",
      items: { type: "string" },
      description: "The main products of the company"
    }
  },
  required: ["company_name", "description"]
};

const result = await smartScraper(apiKey, url, "Extract company information", schema);
```
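Whichever SDK you use, it can be worth a quick client-side check that the fields listed in `required` actually came back before relying on the result. A minimal sketch of such a check (not part of either SDK; Python used for illustration):

```python
def check_required(result: dict, required: list) -> list:
    """Return the names of required fields that are missing or empty."""
    return [field for field in required if not result.get(field)]

# Hypothetical extraction result with one empty required field
extracted = {
    "company_name": "Example Corp",
    "description": "",  # empty -> treated as missing
    "main_products": ["Widgets"],
}
missing = check_required(extracted, ["company_name", "description"])
print(missing)  # prints: ['description']
```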
## Authentication

Both SDKs support authentication via API key. We recommend storing your API key in environment variables:

```bash
# For Python
export SCRAPEGRAPH_API_KEY="your-api-key-here"

# For Node.js
export SCRAPEGRAPH_API_KEY="your-api-key-here"
```

Or using a `.env` file:

```plaintext
SCRAPEGRAPH_API_KEY="your-api-key-here"
```

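If you would rather not depend on `python-dotenv`, a bare-bones loader for a simple `.env` file like the one above can be sketched as follows (illustrative only; `python-dotenv` handles comments, quoting, and interpolation far more robustly):

```python
import os
import tempfile

def load_env_file(path):
    """Minimal .env loader: parses KEY=value lines, strips quotes, skips comments."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"').strip("'")
    os.environ.update(values)
    return values

# Demonstrate with a temporary .env file
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write('SCRAPEGRAPH_API_KEY="your-api-key-here"\n')
    env_path = fh.name

loaded = load_env_file(env_path)
print(loaded["SCRAPEGRAPH_API_KEY"])  # prints: your-api-key-here
```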
## Error Handling

Both SDKs return errors in a consistent format:

```json
{
  "error": "HTTP error occurred",
  "message": "Error details",
  "status_code": 400
}
```

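As a sketch of how a caller might turn this payload into a typed exception (the exception class here is illustrative, not part of either SDK):

```python
import json

class ScrapeGraphAPIError(Exception):
    """Raised when a response matches the documented error payload."""
    def __init__(self, error, message, status_code):
        super().__init__(f"{error} ({status_code}): {message}")
        self.error = error
        self.message = message
        self.status_code = status_code

def raise_for_error(response_body: str):
    """Parse a raw JSON response; raise if it carries the error shape, else return it."""
    payload = json.loads(response_body)
    if isinstance(payload, dict) and "error" in payload and "status_code" in payload:
        raise ScrapeGraphAPIError(
            payload["error"], payload.get("message", ""), payload["status_code"]
        )
    return payload

body = '{"error": "HTTP error occurred", "message": "Error details", "status_code": 400}'
try:
    raise_for_error(body)
except ScrapeGraphAPIError as exc:
    print(exc.status_code)  # prints: 400
```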
## Development

### Python SDK Requirements

- Python 3.9+
- [Rye](https://rye-up.com/) for dependency management (optional)

### JavaScript SDK Requirements

- Node.js 14+
- npm or yarn

## Contributing

We welcome contributions to both SDKs! Please check our [Contributing Guidelines](CONTRIBUTING.md) for more information.

## License

Both SDKs are licensed under the MIT License.

## Support

For support:

- Visit the [ScrapeGraph AI Documentation](https://sgai-api.onrender.com/docs)
- Check the examples in the respective SDK's `examples` directory
- Contact our support team

## Links

- [Python SDK Documentation](https://github.com/ScrapeGraphAI/scrapegraph-sdk/tree/main/scrapegraph-py)
- [JavaScript SDK Documentation](https://github.com/ScrapeGraphAI/scrapegraph-sdk/tree/main/scrapegraph-js)