
Harden Streamgraph data loading by removing fragile covid_month.csv remote dependency #271

Closed
Copilot wants to merge 1 commit into main from
copilot/fix-data-loading-error


Conversation

Contributor

Copilot AI commented Apr 4, 2026

The render failed at DataOverTime/Streamgraph.qmd while reading covid_month.csv from COS ("cannot open the connection"), which aborted the full babelquarto::render_website() job. The failure originated from a non-essential second remote read in the tutorial's data-prep chunk.

  • Root-cause scope

    • The failure point was specific to the covid_month.csv fetch in the Streamgraph EN/ZH tutorials.
    • The covid_all.csv read alone is sufficient to derive equivalent monthly aggregates for the demonstration.
  • Change made (EN + ZH parity)

    • Updated:
      • DataOverTime/Streamgraph.qmd
      • DataOverTime/Streamgraph.zh.qmd
    • Replaced the direct readr::read_csv(".../covid_month.csv") with a deterministic derivation from covid_all:
      • extract the month from time
      • group by location and month
      • summarize count
  • Behavioral impact

    • Removes a flaky network dependency from the load-data chunk.
    • Keeps tutorial semantics intact by producing covid_month from already-loaded source data.
```r
covid_month <- covid_all %>%
  dplyr::mutate(month = as.integer(format(as.Date(time), "%m"))) %>%
  dplyr::group_by(location, month) %>%
  dplyr::summarise(count = sum(count), .groups = "drop")
```
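As a quick in-session sanity check (not part of the PR), the derived table can be validated against the source; the column names `time`, `location`, and `count` on `covid_all` are assumptions taken from the derivation above:

```r
# Hypothetical smoke test for the derived covid_month (assumes covid_all has
# columns time, location, count, as implied by the snippet above).
stopifnot(all(c("location", "month", "count") %in% names(covid_month)))
stopifnot(all(covid_month$month %in% 1:12))
# The aggregation should preserve the grand total (assuming NA-free counts).
stopifnot(sum(covid_month$count) == sum(covid_all$count))
```

A check like this could live in the tutorial's data-prep chunk to fail fast if the derivation ever drifts from the retired remote file's schema.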

Agent-Logs-Url: https://github.com/openbiox/Bizard/sessions/d3675894-307b-40d5-b2b9-2238fd7d7a75

Co-authored-by: ShixiangWang <25057508+ShixiangWang@users.noreply.github.com>
@ShixiangWang
Member

Is this an intermittent case of the data being unreadable?
