Expose subgraph logs via subgraph GraphQL #6278
Open · fordN wants to merge 11 commits into master from ford/subgraph-logs-via-graphql
+4,881 −233
Conversation
Force-pushed from 688827a to 120d61b
Introduces the foundation for the log store system with:
- LogStore trait for querying logs from backends
- LogLevel enum with a FromStr trait implementation
- LogEntry and LogQuery types for structured log data
- LogStoreFactory for creating backend instances
- NoOpLogStore as the default (disabled) implementation
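Based on the commit description above, the core abstraction could look roughly like this. This is a simplified, synchronous sketch; the actual trait in the PR is likely async, and the field and error types shown here are illustrative, not verbatim:

```rust
/// Severity levels exposed through the query API (sketch).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum LogLevel {
    Critical,
    Error,
    Warning,
    Info,
    Debug,
}

impl std::str::FromStr for LogLevel {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_ascii_lowercase().as_str() {
            "critical" => Ok(LogLevel::Critical),
            "error" => Ok(LogLevel::Error),
            "warning" => Ok(LogLevel::Warning),
            "info" => Ok(LogLevel::Info),
            "debug" => Ok(LogLevel::Debug),
            other => Err(format!("unknown log level: {other}")),
        }
    }
}

/// One structured log line returned to the GraphQL layer (sketch).
#[derive(Debug, Clone)]
pub struct LogEntry {
    pub timestamp: String, // ISO 8601
    pub level: LogLevel,
    pub text: String,
}

/// Filters accepted by the `_logs` query field (sketch).
#[derive(Debug, Default, Clone)]
pub struct LogQuery {
    pub level: Option<LogLevel>,
    pub search: Option<String>,
    pub first: usize,
    pub skip: usize,
}

/// Backend-agnostic interface implemented by the File, Elasticsearch,
/// and Loki stores described in the commits below.
pub trait LogStore {
    fn query_logs(&self, query: &LogQuery) -> Result<Vec<LogEntry>, String>;
}

/// Default implementation used when log storage is disabled.
pub struct NoOpLogStore;

impl LogStore for NoOpLogStore {
    fn query_logs(&self, _query: &LogQuery) -> Result<Vec<LogEntry>, String> {
        Ok(Vec::new())
    }
}
```

The NoOpLogStore default means operators who never configure a backend pay no storage or query cost.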
Implements three log storage backends for querying logs:
- FileLogStore: streams JSON Lines files with bounded memory usage
- ElasticsearchLogStore: queries Elasticsearch indices with full-text search
- LokiLogStore: queries Grafana Loki using LogQL

All backends implement the LogStore trait and support:
- Filtering by log level, timestamp range, and text search
- Pagination via first/skip parameters
- Returning structured LogEntry objects

Dependencies added: reqwest and serde_json for the HTTP clients.
Implements slog drains for capturing and writing logs:
- FileDrain: writes logs to JSON Lines files (one file per subgraph)
- LokiDrain: writes logs to Grafana Loki via the HTTP push API

Both drains:
- Capture structured log entries with metadata (module, line, column)
- Format logs with timestamp, level, text, and arguments
- Use efficient serialization with custom KVSerializers
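Assuming one JSON object per line, a FileDrain record might look something like this. The field names are inferred from the GraphQL schema described later in this PR and are illustrative, not the exact on-disk format:

```json
{"timestamp": "2024-01-15T12:34:56Z", "level": "error", "text": "handler failed: timeout", "arguments": {"block": "12345"}, "meta": {"module": "mapping", "line": 42, "column": 7}}
```

One self-describing object per line is what lets FileLogStore stream and filter the file with bounded memory instead of parsing it whole.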
Adds a configuration layer for selecting and configuring log backends:
- LogStoreConfig enum with variants: Disabled, File, Elasticsearch, Loki
- LogConfigProvider for loading config from environment variables and CLI args
- Unified GRAPH_LOG_STORE_* environment variable naming
- CLI arguments via --log-store-backend plus backend-specific options
- Configuration precedence: CLI args > env vars > defaults
- Deprecation warnings for the old config variables

Supported configuration:
- Backend selection (disabled, file, elasticsearch, loki)
- File: directory, max size, retention days
- Elasticsearch: endpoint, credentials, index, timeout
- Loki: endpoint, tenant ID
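The precedence rule (CLI args > env vars > defaults) reduces, per setting, to something like the following sketch; the function name and default are illustrative, not taken from the PR:

```rust
/// Resolve one config value with the precedence described above:
/// CLI argument wins over environment variable, which wins over
/// the built-in default ("disabled", so logging is opt-in).
fn resolve_backend(cli_arg: Option<&str>, env_var: Option<&str>) -> String {
    cli_arg
        .or(env_var)
        .unwrap_or("disabled")
        .to_string()
}
```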
Refactors LoggerFactory to use LogStoreConfig instead of the Elasticsearch-only config:
- Replaced the elastic_config parameter with log_store_config
- Build ElasticLoggingConfig on demand from LogStoreConfig::Elasticsearch
- Support all log drain types (File, Loki, Elasticsearch)
- Maintain backward compatibility with the existing elastic configuration

This enables the factory to create drains for any configured backend while preserving the existing component logger patterns.
Adds a GraphQL API for querying subgraph logs.

Schema types:
- LogLevel enum (CRITICAL, ERROR, WARNING, INFO, DEBUG)
- _Log_ type with id, timestamp, level, text, arguments, meta
- _LogArgument_ type for structured key-value pairs
- _LogMeta_ type for source location (module, line, column)

Query field (_logs) with filters:
- level: filter by log level
- from/to: timestamp range (ISO 8601)
- search: text search in log messages
- first/skip: pagination (first max 1000, skip max 10000)
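In SDL form, the schema additions listed above might look roughly like this. This is a sketch reconstructed from the commit message, not the exact schema generated by the PR (nullability in particular is guessed):

```graphql
enum LogLevel {
  CRITICAL
  ERROR
  WARNING
  INFO
  DEBUG
}

type _LogArgument_ {
  key: String!
  value: String!
}

type _LogMeta_ {
  module: String
  line: Int
  column: Int
}

type _Log_ {
  id: ID!
  timestamp: String!
  level: LogLevel!
  text: String!
  arguments: [_LogArgument_!]!
  meta: _LogMeta_
}
```

The underscore-wrapped names keep the log types out of the namespace that user-defined subgraph entities occupy.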
Integrates the _logs query into the GraphQL execution pipeline.

Execution layer:
- Execute _logs queries via log_store.query_logs()
- Convert LogEntry results to GraphQL response objects
- Handle log store errors gracefully

Query parsing:
- Recognize _logs as a special query field
- Build a LogQuery from the GraphQL arguments
- Pass the log_store to the execution context

Service wiring:
- Create the log store from configuration in the launcher
- Provide the log store to the GraphQL runner
- Use NoOpLogStore in test environments

This completes the read path from GraphQL query to log storage backend.
Adds a comprehensive integration test for the _logs query:
- Deploys the logs-query subgraph and waits for it to sync
- Triggers contract events to generate logs
- Queries the _logs field with various filters
- Verifies log entries are returned correctly
- Tests filtering by level and text search
Force-pushed from 120d61b to ee0f228
- Create graph/src/log/common.rs for common log drain functionality:
  - SimpleKVSerializer: concatenates KV pairs into strings
  - VecKVSerializer: collects KV pairs into Vec<(String, String)>
  - HashMapKVSerializer: collects KV pairs into a HashMap
  - LogMeta: shared metadata structure (module, line, column)
  - LogEntryBuilder: builder for common log entry fields
  - level_to_str(): converts slog::Level to a string
  - create_async_logger(): consistent async logger creation
- Update FileDrain, LokiDrain, and ElasticDrain to use the common log utilities
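The three serializer variants differ only in the collection they accumulate into. A dependency-free sketch of the HashMap flavor is below; the real implementations plug into slog's Serializer trait rather than exposing an `emit` method like this one:

```rust
use std::collections::HashMap;

/// Simplified stand-in for HashMapKVSerializer: collects the key-value
/// pairs attached to one log record into a HashMap.
#[derive(Default)]
struct HashMapKVSerializer {
    kvs: HashMap<String, String>,
}

impl HashMapKVSerializer {
    /// Record one KV pair; later values for the same key overwrite earlier ones.
    fn emit(&mut self, key: &str, value: impl std::fmt::Display) {
        self.kvs.insert(key.to_string(), value.to_string());
    }

    /// Consume the serializer and hand back the collected pairs.
    fn finish(self) -> HashMap<String, String> {
        self.kvs
    }
}
```

VecKVSerializer would keep a `Vec<(String, String)>` instead, preserving duplicate keys and emission order.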
This PR introduces a more flexible subgraph log storage and querying system for Graph Node and enables subgraph logs to be queried through the GraphQL subgraph query API. The implementation supports multiple log storage backends (File, Elasticsearch, and Loki) with a consistent query interface exposed to users in the subgraph's GraphQL schema.
What's new
GraphQL Query API
Storage Backends
Architecture
Examples
Querying logs
```graphql
{
  _logs(
    level: ERROR
    search: "timeout"
    from: "2024-01-15T00:00:00Z"
    to: "2024-01-16T00:00:00Z"
    first: 100
  ) {
    id
    timestamp
    level
    text
    arguments {
      key
      value
    }
    meta {
      module
      line
      column
    }
  }
}
```

Configuring the logs store backend
File-based (development):
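For example, a file-backed setup might look like the following. The variable names follow the GRAPH_LOG_STORE_* convention mentioned in the commits but are illustrative; check the PR's config code for the exact names:

```shell
# Select the file backend: logs are written as JSON Lines,
# one file per subgraph (variable names are hypothetical).
export GRAPH_LOG_STORE_BACKEND=file
export GRAPH_LOG_STORE_FILE_DIRECTORY=/var/lib/graph-node/logs
```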
Loki (production):
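A Loki-backed setup could be configured along these lines; again, the endpoint and tenant variable names are hypothetical placeholders following the GRAPH_LOG_STORE_* convention, not verbatim from the PR:

```shell
# Select the Loki backend and point it at a Loki instance
# (variable names are hypothetical).
export GRAPH_LOG_STORE_BACKEND=loki
export GRAPH_LOG_STORE_LOKI_ENDPOINT=http://loki:3100
export GRAPH_LOG_STORE_LOKI_TENANT_ID=graph-node
```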