spec(wikibase-integration): use nodemw package for Wikibase integration#141
Conversation
📝 Walkthrough

Summary by CodeRabbit

Walkthrough: Adds three specification documents under .kiro/specs/mwn-wikibase-integration/ — a design, requirements, and task plan — for migrating Wikibase integration to a nodemw-based backend with new services, endpoints, caching, real-time validation, and corresponding frontend updates. No code changes or exported/public API alterations.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant UI as Frontend (Vue 3)
    participant API as Backend (Elysia)
    participant NWS as NodemwWikibaseService
    participant PCS as PropertyCacheService
    participant CVS as ConstraintValidationService
    participant NMW as nodemw client
    participant WB as Wikibase API

    UI->>API: Search properties (q, options)
    API->>PCS: Cache lookup
    alt Cache hit
        PCS-->>API: Cached results
    else Cache miss
        API->>NWS: searchProperties(q, options)
        NWS->>NMW: mw.search (props)
        NMW->>WB: action=query (search)
        WB-->>NMW: results
        NMW-->>NWS: results
        NWS-->>API: results
        API->>PCS: Store results
    end
    API-->>UI: Search results

    UI->>API: Validate field against constraints
    API->>CVS: validate(propertyId, value)
    CVS->>NWS: getConstraints(propertyId)
    NWS->>NMW: mw.getConstraints
    NMW->>WB: action=wbgetentities (constraints)
    WB-->>NMW: constraints
    NMW-->>NWS: constraints
    NWS-->>CVS: constraints
    CVS-->>API: ValidationResult
    API-->>UI: Validation feedback
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Actionable comments posted: 5
🧹 Nitpick comments (12)
.kiro/specs/mwn-wikibase-integration/requirements.md (4)
23-31: Clarify “usage statistics” deliverable or remove from the user story

The user story mentions “usage statistics,” but the acceptance criteria below do not reference usage stats. Either add acceptance criteria for usage stats (definition, source, format), or remove the term to keep scope tight.

Apply one of the following diffs to resolve the inconsistency:

Option A — add an acceptance criterion:

```diff
 #### Acceptance Criteria
@@
 2. WHEN property information is retrieved THEN the system SHALL use nodemw's API methods to fetch comprehensive property data including multilingual labels, descriptions, data types, and aliases
 3. WHEN displaying property suggestions THEN the system SHALL use nodemw's search capabilities to provide autocomplete functionality with property ranking
 4. WHEN property data is requested THEN the system SHALL leverage nodemw's built-in request queuing for optimal performance
 5. IF property searches return large result sets THEN the system SHALL use nodemw's pagination handling to manage large responses efficiently
+6. WHEN usage statistics are requested THEN the system SHALL define the metric (e.g., statement count, sitelinks, inbound usage) and data source (e.g., WDQS or entity usage endpoints), and include them in the response where available
```

Option B — remove “usage statistics” from the story:

```diff
-**User Story:** As a schema editor user, I want to search and retrieve property information using nodemw's comprehensive MediaWiki API methods, so that I can access detailed property data including labels, descriptions, data types, and usage statistics with reliable performance.
+**User Story:** As a schema editor user, I want to search and retrieve property information using nodemw's comprehensive MediaWiki API methods, so that I can access detailed property data including labels, descriptions, data types, and aliases with reliable performance.
```
39-44: Avoid attributing “claim parsing” to nodemw; define parsing in our code instead

Nodemw returns API payloads but doesn’t provide high-level “claim parsing” semantics. Rephrase to indicate our service parses claims/constraints from API responses.

Apply:

```diff
-2. WHEN constraint data is retrieved THEN the system SHALL parse constraint types including format constraints, value type constraints, and allowed values using nodemw's claim parsing
+2. WHEN constraint data is retrieved THEN the system SHALL parse constraint types (format, value type, allowed values, etc.) within our service from nodemw API responses
@@
-5. IF constraint data is complex THEN the system SHALL use nodemw's claim parsing capabilities to handle qualifiers and references in constraints
+5. IF constraint data is complex THEN the system SHALL handle qualifiers and references in constraints within our parsing logic using nodemw API responses
```
47-56: Temper statements about TypeScript support; nodemw typings are limited

Nodemw does not ship comprehensive TS types. Plan to add custom type definitions or wrapper interfaces and avoid implying first-class TS coverage.

```diff
-1. WHEN implementing nodemw integration THEN the system SHALL use nodemw's TypeScript configuration and type definitions where available
+1. WHEN implementing nodemw integration THEN the system SHALL use available TypeScript definitions and supplement with custom .d.ts or wrapper interfaces where needed
@@
-4. WHEN extending functionality THEN the system SHALL leverage nodemw's direct API call capabilities for custom operations
-5. IF custom API calls are needed THEN the system SHALL use nodemw's api.call method while maintaining consistent error handling
+4. WHEN extending functionality THEN the system SHALL leverage nodemw's low-level API call capability (e.g., client.api.call) for custom operations
+5. IF custom API calls are needed THEN the system SHALL use the low-level API and maintain consistent error handling and response typing
```
99-104: Be precise about nodemw concurrency/parallelism limits

Nodemw provides throttling/queuing; parallelism is constrained. Clarify “parallel processing” as “batched/queued requests with configurable concurrency,” and ensure instance-safe queues.

```diff
-4. WHEN handling large datasets THEN the system SHALL use nodemw's parallel processing capabilities with configurable concurrency limits
+4. WHEN handling large datasets THEN the system SHALL use nodemw's queued request processing with configurable per-instance concurrency limits
```

.kiro/specs/mwn-wikibase-integration/design.md (3)
519-549: Add “maxlag” and User-Agent guidance under Security/Rate limiting

MediaWiki requests should set maxlag and an identifying User-Agent. This helps avoid overloading servers and improves operator visibility.

```diff
 2. **Rate Limiting**:
-   - Respect Wikibase instance rate limits
-   - Implement client-side throttling
-   - Queue management for high-volume operations
+   - Respect Wikibase instance rate limits and use `maxlag` parameter on API calls
+   - Implement client-side throttling and per-instance queues
+   - Identify requests with a descriptive User-Agent string per instance/config
```
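To make the suggestion concrete, a per-instance request-policy shape could look like the sketch below. This is a hypothetical internal config (the `RequestPolicy` and `resolvePolicy` names are not from the spec), which the service would translate into queue/throttle settings and request parameters:

```typescript
// Hypothetical internal shape for a per-instance request policy — not a real
// nodemw option set; the service layer would apply these to its queue and
// outgoing API parameters.
interface RequestPolicy {
  maxConcurrency: number // simultaneous in-flight requests per instance
  maxlag?: number // MediaWiki maxlag parameter attached to read requests
  userAgent: string // descriptive, per-instance identification
}

function resolvePolicy(partial: Partial<RequestPolicy>, instanceId: string): RequestPolicy {
  return {
    // Clamp to at least 1 so a misconfigured 0 cannot stall the queue
    maxConcurrency: Math.max(1, partial.maxConcurrency ?? 3),
    maxlag: partial.maxlag ?? 5,
    userAgent: partial.userAgent ?? `schema-editor/${instanceId}`,
  }
}

const policy = resolvePolicy({ maxConcurrency: 0 }, 'wikidata')
```

Defaulting `maxlag` to 5 follows the commonly recommended value for non-interactive MediaWiki clients.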
554-560: Consider a feature-flagged migration and compatibility layer

Even if breaking changes are acceptable, a feature flag or adapter eases rollout and rollback.

```diff
-### Direct Replacement Approach
+### Direct Replacement Approach (with feature flags)
@@
-1. **Replace WikibaseApiService**: Direct replacement with `NodemwWikibaseService`
+1. **Replace WikibaseApiService**: Add `NodemwWikibaseService`, gated by a feature flag; maintain an adapter to fallback if needed during rollout
```
409-442: Update test doubles to stub existing `getArticleClaims` and low-level `client.api.call`

- The nodemw client already provides a high-level helper `getArticleClaims`; you can continue mocking this method in your tests.
- Since there are no built-in nodemw helpers for constraint parsing or other Wikibase modules, create and stub a thin wrapper around `client.api.call` (e.g. for `action=wbgetclaims`/`wbgetentities`) and feed it canned MediaWiki API payloads when testing `getPropertyConstraints`.

.kiro/specs/mwn-wikibase-integration/tasks.md (5)
53-65: Add tasks for maxlag, retry/backoff, and per-instance throttling

Operational resilience should be explicit in the plan.

```diff
 - [ ] 4. Implement property caching service for performance optimization
@@
   - [ ] 4.2 Integrate caching with NodemwWikibaseService
     - Add caching to property search and retrieval methods
     - Implement cache-first strategies with fallback to nodemw
     - Add cache warming for frequently accessed properties
     - _Requirements: 8.1, 8.2, 8.5_
+  - [ ] 4.3 Add request policies
+    - Attach `maxlag` parameter to read requests
+    - Implement exponential backoff on RATE_LIMITED or maxlag responses
+    - Enforce per-instance concurrency limits and descriptive User-Agent headers
```
91-103: Frontend validation: specify debounced watch implementation

Echoing the design comment, add a concrete task to implement debounced watchers using a utility (lodash or @vueuse).

```diff
   - [ ] 6.2 Add real-time validation to property selection
     - Implement constraint checking when properties are selected
     - Add visual indicators for constraint compliance
     - Create validation feedback messages for property selection
     - _Requirements: 6.2, 6.3, 6.4_
+    - Use debounced watchers (lodash-es or @vueuse/useDebounceFn) to limit API calls
```
143-155: Types: plan for custom interfaces instead of leaking nodemw types

Make explicit that we’ll define internal interfaces and adapters to avoid tight coupling to nodemw’s surface and to fill typing gaps.

```diff
 - [ ] 10. Update type definitions and interfaces for nodemw integration
   - [ ] 10.1 Update backend type definitions
     - Modify WikibaseInstanceConfig to include nodemw configuration
     - Update PropertyDetails to include constraint and usage information
     - Create new constraint and validation result type definitions
+    - Define internal client interfaces (e.g., INodemwClient) and adapters around nodemw
     - _Requirements: 4.1, 4.2, 4.4_
```
156-168: Error handling tasks: include MW API error mapping and maxlag handling

Map MW API error codes and implement retry guidance, not just generic network errors.

```diff
 - [ ] 11. Implement comprehensive error handling for nodemw integration
   - [ ] 11.1 Create WikibaseErrorHandler service
-    - Map nodemw errors to application-specific error types
-    - Implement retry strategies for different error scenarios
+    - Map nodemw/network errors and MediaWiki API error codes (ratelimited, maxlag, assertuserfailed) to app-specific types
+    - Implement retry strategies with exponential backoff; respect Retry-After when present
     - Add graceful degradation for service failures
     - _Requirements: 1.3, 5.5, 8.5_
```
169-181: Testing: ensure non-destructive integration tests and secrets hygiene

Explicitly state tests avoid writes, and secrets are injected via env with masking.

```diff
 - [ ] 12. Add comprehensive testing for nodemw integration
   - [ ] 12.2 Create integration tests with real Wikibase instances
     - Test connectivity and data retrieval from Wikidata
     - Test constraint validation with real constraint data
     - Test performance and caching behavior under load
+    - Ensure tests are non-destructive (read-only) and do not perform write actions
+    - Manage credentials via environment variables with CI secret masking
     - _Requirements: All requirements - integration testing_
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (3)
- .kiro/specs/mwn-wikibase-integration/design.md (1 hunks)
- .kiro/specs/mwn-wikibase-integration/requirements.md (1 hunks)
- .kiro/specs/mwn-wikibase-integration/tasks.md (1 hunks)
🔇 Additional comments (3)
.kiro/specs/mwn-wikibase-integration/requirements.md (1)
63-68: Confirm availability of `getMediaWikiVersion` or fall back to `action=query&meta=siteinfo&siprop=general`

Update the acceptance criteria so that they clearly cover both cases:

- If the nodemw client exposes a dedicated `getMediaWikiVersion` method, use it.
- Otherwise, invoke the MediaWiki API directly with `action=query&meta=siteinfo&siprop=general` to retrieve the version.
- Amend the requirements text to guarantee a reliable version lookup in all environments.

Please verify in the nodemw package documentation or TypeScript definitions whether `getMediaWikiVersion` exists, and adjust the wording in .kiro/specs/mwn-wikibase-integration/requirements.md accordingly.

.kiro/specs/mwn-wikibase-integration/design.md (1)
386-401: Align FallbackService with the new cache.peek API and instance-aware cache keys

- Confirm that `cache.peek<T>(key)` is implemented on the Cache interface.
- Introduce or expose a consistent key-builder, e.g. `generateCacheKey(instanceId: string, operation: string, params: object): string`.
- Update `getPropertyWithFallback` to use the generated key and `peek` instead of inlining a raw string key.

```diff
   // Fall back to cache
-  const cached = await this.cache.get(`property:${instanceId}:${propertyId}`)
+  // Use instance-aware key generation and non-mutating peek
+  const key = this.cache.generateCacheKey(instanceId, 'getProperty', { propertyId })
+  const cached = this.cache.peek<PropertyDetails>(key)
   if (cached) return cached
```

Note: If `generateCacheKey` or `peek` aren’t yet public, please add a well-documented key-builder method to the Cache API.

.kiro/specs/mwn-wikibase-integration/tasks.md (1)
34-41: Leverage nodemw `getArticleClaims` for claims retrieval

nodemw's Wikidata client already provides a `getArticleClaims` helper. Update your task spec accordingly:

- File: .kiro/specs/mwn-wikibase-integration/tasks.md
- Step 3.1:
  - Replace “nodemw Wikidata client integration” with a call to `const wd = new Wikidata(); const claims = await wd.getArticleClaims(entityTitle);`
  - Parse those claims for constraint logic.
- Reserve `client.api.call({ action: 'wbgetentities' | 'wbgetclaims', … })` only for when you need custom parameters, pagination, or other advanced options.
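The low-level path can be sketched as two pure pieces: a builder for the raw `wbgetclaims` parameters, and a parser a unit test can feed with canned payloads (no live client needed). Function names here are hypothetical; P2302 is Wikidata's "property constraint" property:

```typescript
// Build the raw parameters the client.api.call wrapper would send.
function buildConstraintQuery(propertyId: string): Record<string, string> {
  return {
    action: 'wbgetclaims',
    entity: propertyId, // e.g. 'P31'
    property: 'P2302', // only fetch the constraint statements
    format: 'json',
  }
}

// Payload shape follows the MediaWiki API's { claims: { P2302: [...] } } structure.
interface ClaimsPayload {
  claims: Record<string, Array<{ mainsnak: { datavalue?: { value: { id?: string } } } }>>
}

// Pure parser: each P2302 claim's value is an item ID identifying the constraint type.
function parseConstraintTypes(payload: ClaimsPayload): string[] {
  return (payload.claims['P2302'] ?? [])
    .map((claim) => claim.mainsnak.datavalue?.value.id)
    .filter((id): id is string => typeof id === 'string')
}

const query = buildConstraintQuery('P31')
const types = parseConstraintTypes({
  claims: { P2302: [{ mainsnak: { datavalue: { value: { id: 'Q21502404' } } } }] },
})
```

Keeping the parser pure is what makes the test-double strategy above cheap: tests exercise `parseConstraintTypes` with fixtures and only stub the thin transport wrapper.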
```typescript
export class NodemwWikibaseService {
  private clients: Map<string, nodemw> = new Map()
  private wikidataClients: Map<string, WikidataClient> = new Map()
  private instances: Map<string, WikibaseInstanceConfig> = new Map()
  private cache: PropertyCacheService
  private validator: ConstraintValidationService

  // Client management
  createClient(instanceConfig: WikibaseInstanceConfig): nodemw
  getClient(instanceId: string): nodemw
  getWikidataClient(instanceId: string): WikidataClient

  // Property operations
  async searchProperties(instanceId: string, query: string, options: SearchOptions): Promise<PropertySearchResponse>
  async getProperty(instanceId: string, propertyId: string): Promise<PropertyDetails>
  async getPropertyConstraints(instanceId: string, propertyId: string): Promise<PropertyConstraint[]>

  // Item ope
sync searchItems(instanceId: string, query: string, options: SearchOptions): Promise<ItemSearchResponse>
  async getItem(instanceId: string, itemId: string): Promise<ItemDetails>

  // Validation operations
  async validatePropertyValue(instanceId: string, propertyId: string, value: any): Promise<ValidationResult>
  async validateSchema(instanceId: string, schema: SchemaMapping): Promise<SchemaValidationResult>
}
```
🛠️ Refactor suggestion
Fix method signature typos and unrealistic types for nodemw clients
- “// Item ope” and “sync searchItems” are typographical errors.
- Types “nodemw” and “WikidataClient” are not valid TypeScript types. Use a concrete client type or define interfaces/wrappers for the nodemw client(s).
```diff
 export class NodemwWikibaseService {
-  private clients: Map<string, nodemw> = new Map()
-  private wikidataClients: Map<string, WikidataClient> = new Map()
+  // Prefer explicit interfaces to avoid leaking 3rd-party surface into our domain
+  private clients: Map<string, INodemwClient> = new Map()
+  private wikidataClients: Map<string, INodemwClient> = new Map()
   private instances: Map<string, WikibaseInstanceConfig> = new Map()
   private cache: PropertyCacheService
   private validator: ConstraintValidationService

   // Client management
-  createClient(instanceConfig: WikibaseInstanceConfig): nodemw
-  getClient(instanceId: string): nodemw
-  getWikidataClient(instanceId: string): WikidataClient
+  createClient(instanceConfig: WikibaseInstanceConfig): INodemwClient
+  getClient(instanceId: string): INodemwClient
+  getWikidataClient(instanceId: string): INodemwClient

   // Property operations
   async searchProperties(instanceId: string, query: string, options: SearchOptions): Promise<PropertySearchResponse>
   async getProperty(instanceId: string, propertyId: string): Promise<PropertyDetails>
   async getPropertyConstraints(instanceId: string, propertyId: string): Promise<PropertyConstraint[]>

-  // Item ope
-sync searchItems(instanceId: string, query: string, options: SearchOptions): Promise<ItemSearchResponse>
+  // Item operations
+  async searchItems(instanceId: string, query: string, options: SearchOptions): Promise<ItemSearchResponse>
   async getItem(instanceId: string, itemId: string): Promise<ItemDetails>
```

Outside this block, define a minimal client interface you own:
```typescript
export interface INodemwClient {
  apiCall<T>(params: Record<string, any>): Promise<T>; // wrapper around client.api.call
  // add only what you use (e.g., search, getEntities) to avoid tight coupling
}
```

🤖 Prompt for AI Agents
In .kiro/specs/mwn-wikibase-integration/design.md around lines 66 to 91, fix the
typographical errors and replace invalid TypeScript types: change the comment "
// Item ope" to " // Item operations" and correct "sync searchItems" to "async
searchItems"; replace the unrecognized types "nodemw" and "WikidataClient" with
concrete interfaces/wrappers (e.g., INodemwClient and IWikidataClient) and
update the class fields and method signatures to use those interfaces (clients:
Map<string, INodemwClient>, wikidataClients: Map<string, IWikidataClient>,
getClient/getWikidataClient return the interfaces, etc.); additionally, declare
minimal client interfaces outside this class (as suggested) exposing only the
methods you need (e.g., apiCall/search/getEntities) to avoid tight coupling.
```typescript
export class PropertyCacheService {
  private cache: Map<string, CacheEntry> = new Map()
  private ttlMap: Map<string, number> = new Map()

  async get<T>(key: string, fetcher: () => Promise<T>, ttl?: number): Promise<T>
  async invalidate(pattern: string): Promise<void>
  async clear(instanceId?: string): Promise<void>

  // Cache strategies
  private generateCacheKey(instanceId: string, operation: string, params: any): string
  private isExpired(entry: CacheEntry): boolean
  private cleanup(): void
}
```
Cache API mismatch: FallbackService uses cache.get(key) without fetcher
PropertyCacheService.get requires (key, fetcher, ttl?), but FallbackService calls get(key) directly. Provide a non-fetching “peek” API or a getIfPresent.
```diff
 export class PropertyCacheService {
   private cache: Map<string, CacheEntry> = new Map()
   private ttlMap: Map<string, number> = new Map()

   async get<T>(key: string, fetcher: () => Promise<T>, ttl?: number): Promise<T>
+  // Returns cached value if present and not expired; does not invoke fetcher
+  peek<T>(key: string): T | undefined
   async invalidate(pattern: string): Promise<void>
   async clear(instanceId?: string): Promise<void>
```

Then in FallbackService (below), use peek instead of get.
🤖 Prompt for AI Agents
.kiro/specs/mwn-wikibase-integration/design.md lines 98-110: the current
PropertyCacheService.get signature requires a fetcher but FallbackService calls
get(key) with no fetcher; add a non-fetching peek API (e.g. getIfPresent(key): T
| undefined or async getIfPresent(key): Promise<T | undefined>) that returns the
cached value without invoking a fetcher or changing TTL/entries, keep the
existing get(key, fetcher, ttl?) intact, implement getIfPresent to check the
cache map and expiry and return undefined for missing/expired entries, and
update FallbackService to call this new peek/getIfPresent method instead of
get(key).
```vue
<template>
  <div class="property-selector">
    <AutoComplete
      v-model="selectedProperty"
      :suggestions="propertySuggestions"
      :loading="isSearching"
      @complete="searchProperties"
      @item-select="onPropertySelect"
    />
    <div v-if="selectedProperty" class="property-details">
      <PropertyInfo :property="propertyDetails" />
      <ConstraintList :constraints="propertyConstraints" />
    </div>
  </div>
</template>
```
🛠️ Refactor suggestion
Vue watch debounce and variable name issue (“statements1”)
- Vue 3’s watch does not support a “debounce” option; use a debounced function or a composable (e.g., @vueuse/core useDebounceFn).
- “statements1” looks like a typo.
```diff
 export const useSchemaStore = defineStore('schema', () => {
@@
-  watch([statements1, labels, descriptions], async () => {
-    if (isDirty.value) {
-      await validateSchema()
-    }
-  }, { debounce: 500 })
+  const triggerValidation = useDebounceFn(async () => {
+    if (isDirty.value) await validateSchema()
+  }, 500)
+
+  watch([statements, labels, descriptions], () => {
+    triggerValidation()
+  })
 })
```

If avoiding external deps:
```typescript
import { debounce } from 'lodash-es'
const triggerValidation = debounce(async () => { /* ... */ }, 500)
```
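If no dependency at all is acceptable, a hand-rolled debounce is small. The sketch below (names hypothetical) adds a `flush()` escape hatch so the behavior is testable without real timers — rapid calls collapse into one invocation:

```typescript
// Minimal debounce with a flush() escape hatch; stands in for
// @vueuse/useDebounceFn or lodash debounce under the stated assumption
// that no external dependency is wanted.
type Debounced<A extends unknown[]> = ((...args: A) => void) & { flush: () => void }

function debounce<A extends unknown[]>(fn: (...args: A) => void, wait: number): Debounced<A> {
  let timer: ReturnType<typeof setTimeout> | undefined
  let lastArgs: A | undefined
  const wrapped = ((...args: A) => {
    lastArgs = args
    if (timer) clearTimeout(timer)
    // Only the last call within the window fires
    timer = setTimeout(() => {
      timer = undefined
      fn(...(lastArgs as A))
    }, wait)
  }) as Debounced<A>
  wrapped.flush = () => {
    if (timer) {
      clearTimeout(timer)
      timer = undefined
      fn(...(lastArgs as A))
    }
  }
  return wrapped
}

let calls = 0
const validate = debounce(() => { calls += 1 }, 500)
validate(); validate(); validate() // three rapid edits
validate.flush() // fire the single pending validation immediately
```

In the store, the watcher would simply call the debounced function; `flush()` is mainly useful in unit tests and on explicit "save" actions.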
In .kiro/specs/mwn-wikibase-integration/design.md around lines 167 to 182, the
Vue watch usage incorrectly attempts to use a non-existent "debounce" option and
the variable name "statements1" appears to be a typo; replace the
watch-with-debounce pattern by creating a debounced callback (either via a small
local debounce wrapper or a composable such as @vueuse/core's useDebounceFn) and
call that debounced function from the watcher, and also rename "statements1" to
the intended variable name (e.g., "statements") everywhere it is referenced to
fix the typo and keep names consistent.
```typescript
// Enhanced error types
export type WikibaseError =
  | 'CONNECTION_FAILED'
  | 'AUTHENTICATION_FAILED'
  | 'RATE_LIMITED'
  | 'PROPERTY_NOT_FOUND'
  | 'CONSTRAINT_VIOLATION'
  | 'VALIDATION_FAILED'
  | 'CACHE_ERROR'
  | 'INSTANCE_UNAVAILABLE'

// Error handling service
export class WikibaseErrorHandler {
  static handle(error: any, context: string): WikibaseError {
    // Map nodemw errors to application errors
    if (error.code === 'ENOTFOUND') return 'CONNECTION_FAILED'
    if (error.code === 'ECONNREFUSED') return 'INSTANCE_UNAVAILABLE'
    if (error.response?.status === 401) return 'AUTHENTICATION_FAILED'
    if (error.response?.status === 429) return 'RATE_LIMITED'

    return 'CONNECTION_FAILED'
  }

  static getRetryStrategy(error: WikibaseError): RetryStrategy {
    switch (error) {
      case 'RATE_LIMITED':
        return { retry: true, delay: 5000, maxRetries: 3 }
      case 'CONNECTION_FAILED':
        return { retry: true, delay: 1000, maxRetries: 2 }
      default:
        return { retry: false }
    }
  }
}
```
🛠️ Refactor suggestion
Map nodemw/MW API errors correctly; don’t rely on Axios-style response.status
Nodemw surfaces network errors and API JSON errors (action=query errors) differently. Enhance error mapping to inspect MW API { error: { code, info } } and respect “maxlag” handling.
```diff
 export class WikibaseErrorHandler {
   static handle(error: any, context: string): WikibaseError {
-    // Map nodemw errors to application errors
+    // Map nodemw/network errors
     if (error.code === 'ENOTFOUND') return 'CONNECTION_FAILED'
     if (error.code === 'ECONNREFUSED') return 'INSTANCE_UNAVAILABLE'
-    if (error.response?.status === 401) return 'AUTHENTICATION_FAILED'
-    if (error.response?.status === 429) return 'RATE_LIMITED'
+    // Map MediaWiki API errors
+    const mwErr = error?.body?.error || error?.error
+    if (mwErr?.code === 'mwoauth-invalid-authorization' || mwErr?.code === 'assertuserfailed') return 'AUTHENTICATION_FAILED'
+    if (mwErr?.code === 'ratelimited' || mwErr?.code === 'maxlag') return 'RATE_LIMITED'

     return 'CONNECTION_FAILED'
   }

   static getRetryStrategy(error: WikibaseError): RetryStrategy {
     switch (error) {
       case 'RATE_LIMITED':
-        return { retry: true, delay: 5000, maxRetries: 3 }
+        return { retry: true, delay: 5000, maxRetries: 3, backoff: 'exponential' }
       case 'CONNECTION_FAILED':
-        return { retry: true, delay: 1000, maxRetries: 2 }
+        return { retry: true, delay: 1000, maxRetries: 2, backoff: 'exponential' }
       default:
         return { retry: false }
     }
   }
 }
```

Also plan to send the `maxlag` param on reads to be a good citizen.
I can add a maxlag-aware request wrapper for the client.
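The two behaviors suggested in the diff can be sketched as pure functions, which keeps them unit-testable: classifying MediaWiki API errors (JSON shape `{ error: { code, info } }`) and computing the exponential backoff. Error codes and base delays mirror the diff above; function names and the 60s cap are assumptions:

```typescript
type WikibaseErrorKind =
  | 'CONNECTION_FAILED'
  | 'AUTHENTICATION_FAILED'
  | 'RATE_LIMITED'
  | 'INSTANCE_UNAVAILABLE'

// Inspect both Node network error codes and the MediaWiki API error envelope.
function classify(error: any): WikibaseErrorKind {
  if (error?.code === 'ENOTFOUND') return 'CONNECTION_FAILED'
  if (error?.code === 'ECONNREFUSED') return 'INSTANCE_UNAVAILABLE'
  const mwErr = error?.body?.error || error?.error
  if (mwErr?.code === 'assertuserfailed') return 'AUTHENTICATION_FAILED'
  if (mwErr?.code === 'ratelimited' || mwErr?.code === 'maxlag') return 'RATE_LIMITED'
  return 'CONNECTION_FAILED'
}

// Exponential backoff: attempt 0 → base, 1 → 2×base, 2 → 4×base, …, capped.
function backoffDelay(attempt: number, baseDelayMs: number, capMs = 60_000): number {
  return Math.min(baseDelayMs * 2 ** attempt, capMs)
}

const kind = classify({ error: { code: 'maxlag' } })
const delays = [0, 1, 2, 3].map((a) => backoffDelay(a, 5000))
```

In production code you would typically add jitter to `backoffDelay` and honor a `Retry-After` header when the server provides one.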
🤖 Prompt for AI Agents
.kiro/specs/mwn-wikibase-integration/design.md around lines 347-381: Current
mapping assumes Axios-style response.status; update the error handling to
inspect nodemw/MW API shapes and network errors: check for network errno values
(ENOTFOUND, ECONNREFUSED, ETIMEDOUT) as before, but for MW API errors inspect
error.error?.code and error.error?.info (e.g. if error.error.code === 'maxlag'
treat as 'RATE_LIMITED' and set a retry strategy that respects MW's suggested
wait, if present parse info for retry delay), map action=query JSON API errors
(like 'badtoken'/'permissiondenied' etc.) to AUTHENTICATION_FAILED or
CONSTRAINT_VIOLATION/VALIDATION_FAILED as appropriate, and stop relying on
error.response?.status; also note to include sending a maxlag parameter on read
requests (add maxlag to read request wrapper) so the client is maxlag-aware.
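The "parse info for retry delay" step above could be sketched as follows; the `info` text format (`"…: N seconds lagged"`) is an assumption based on typical MediaWiki maxlag responses, and the function name is illustrative:

```typescript
// Extract MW's reported replication lag from a maxlag error's info text,
// rounding up to whole seconds; fall back to a fixed wait when unparseable.
function maxlagWaitSeconds(info: string | undefined, fallback = 5): number {
  const match = info?.match(/(\d+(?:\.\d+)?)\s*seconds?\s+lagged/)
  return match ? Math.ceil(Number(match[1])) : fallback
}
```

Servers may also send a `Retry-After` header alongside the maxlag error; when present, it should take precedence over the parsed info text.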
**User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides responsive user experience while respecting API rate limits and server resources.

#### Acceptance Criteria

1. WHEN making repeated API requests THEN the system SHALL leverage nodemw's built-in request queuing to reduce server load and manage concurrency
2. WHEN API rate limits are approached THEN the system SHALL use nodemw's built-in throttling and queue management for automatic request pacing
3. WHEN caching property and constraint data THEN the system SHALL implement appropriate cache invalidation strategies on top of nodemw's request handling
4. WHEN handling large datasets THEN the system SHALL use nodemw's parallel processing capabilities with configurable concurrency limits
5. IF network issues occur THEN the system SHALL rely on nodemw's automatic error handling and retry mechanisms for resilient operation
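Criterion 3's cache-invalidation strategy could be realized as a small TTL cache with lazy invalidation on read. A minimal sketch under stated assumptions — the `TtlCache` name and the injected clock are illustrative, not part of the spec:

```typescript
// TTL cache with lazy invalidation: entries expire on read once stale.
// The clock is injectable so tests can control time deterministically.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>()

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key)
    if (!entry) return undefined
    if (this.now() >= entry.expiresAt) {
      this.store.delete(key) // stale: drop on read
      return undefined
    }
    return entry.value
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs })
  }

  invalidate(key: string): void {
    this.store.delete(key)
  }
}
```

A `PropertyCacheService` could wrap this per data kind (labels, constraints) with different TTLs, calling `invalidate` when an explicit refresh is requested.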
Grammar tweak: add the article "a" in "a responsive user experience"
Minor clarity fix.

```diff
-**User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides responsive user experience while respecting API rate limits and server resources.
+**User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides a responsive user experience while respecting API rate limits and server resources.
```
+**User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides a responsive user experience while respecting API rate limits and server resources.📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| **User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides responsive user experience while respecting API rate limits and server resources. | |
| #### Acceptance Criteria | |
| 1. WHEN making repeated API requests THEN the system SHALL leverage nodemw's built-in request queuing to reduce server load and manage concurrency | |
| 2. WHEN API rate limits are approached THEN the system SHALL use nodemw's built-in throttling and queue management for automatic request pacing | |
| 3. WHEN caching property and constraint data THEN the system SHALL implement appropriate cache invalidation strategies on top of nodemw's request handling | |
| 4. WHEN handling large datasets THEN the system SHALL use nodemw's parallel processing capabilities with configurable concurrency limits | |
| 5. IF network issues occur THEN the system SHALL rely on nodemw's automatic error handling and retry mechanisms for resilient operation | |
| **User Story:** As a developer, I want to implement efficient caching and performance optimization using nodemw's built-in features, so that the schema editor provides a responsive user experience while respecting API rate limits and server resources. |
🧰 Tools
🪛 LanguageTool
[grammar] ~95-~95: There might be a mistake here.
Context: ...-in features, so that the schema editor provides responsive user experience while respec...
(QB_NEW_EN)
🤖 Prompt for AI Agents
.kiro/specs/mwn-wikibase-integration/requirements.md around lines 95 to 103: the
User Story sentence is missing the article "a" before "responsive user
experience"; update the sentence to read "...so that the schema editor provides
a responsive user experience while respecting API rate limits and server
resources." Ensure spacing and punctuation remain correct.