
Redo doc structure (WIP)#210

Draft
jeffcarp wants to merge 2 commits into keras-team:main from jeffcarp:redo-doc-structure-v2

Conversation

@jeffcarp
Member

No description provided.

Contributor

@gemini-code-assist (Bot) left a comment


Code Review

This pull request reorganizes the documentation by moving training examples into a dedicated directory, updating the main index structure, and adjusting CSS for navigation. It also adds standard file exclusions to the Sphinx configuration. Feedback identifies several incorrect relative links resulting from the file moves, particularly for the LLM Fine-tuning guide.
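The summary notes that the PR adds standard file exclusions to the Sphinx configuration. For readers unfamiliar with the mechanism, a typical `conf.py` fragment looks like the sketch below; this is illustrative only, since the PR's actual patterns are not shown in this thread.

```python
# docs/conf.py -- illustrative fragment, not copied from the PR.
# exclude_patterns tells Sphinx which files and directories to skip
# when it scans the source tree for documents.
exclude_patterns = [
    "_build",     # generated build output
    "Thumbs.db",  # Windows Explorer artifact
    ".DS_Store",  # macOS Finder artifact
]
```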

Comment thread docs/examples.md
- [LLM Fine-tuning](llm_finetuning.md): extended walkthrough using the
- [Getting Started](getting_started.md): your first run, end-to-end.
- [Keras Training](examples/keras_training.md): patterns for Keras users.
- [LLM Fine-tuning](guides/llm_finetuning.md): extended walkthrough using the
Contributor


Severity: medium

The link to the LLM Fine-tuning guide is broken. The file was moved from docs/guides/ to docs/examples/. Since this file (docs/examples.md) is now located at the root of the docs/ directory, the link should point to examples/llm_finetuning.md.

Suggested change
- [LLM Fine-tuning](guides/llm_finetuning.md): extended walkthrough using the
- [LLM Fine-tuning](examples/llm_finetuning.md): extended walkthrough using the
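The fix follows from how relative links resolve: a link target is joined against the directory of the file that contains it. A small sketch makes the difference concrete (`resolve_link` is a hypothetical helper, not part of the PR):

```python
import posixpath

def resolve_link(doc_path: str, link: str) -> str:
    """Resolve a relative Markdown link against the directory of doc_path."""
    base = posixpath.dirname(doc_path)
    return posixpath.normpath(posixpath.join(base, link))

# docs/examples.md now lives at the docs/ root, so a guides/ prefix
# points at the old, moved-away location:
resolve_link("docs/examples.md", "guides/llm_finetuning.md")
# -> docs/guides/llm_finetuning.md  (no longer exists)
resolve_link("docs/examples.md", "examples/llm_finetuning.md")
# -> docs/examples/llm_finetuning.md  (new location)
```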

Comment thread docs/examples/keras_training.md
See [Distributed Training](distributed_training.md) for the full
multi-host setup, and [LLM Fine-tuning](llm_finetuning.md) for a
See [Distributed Training](../guides/distributed_training.md) for the full
multi-host setup, and [LLM Fine-tuning](../guides/llm_finetuning.md) for a
Contributor


Severity: medium

The link to the LLM Fine-tuning guide is incorrect. Both keras_training.md and llm_finetuning.md are now located in the same docs/examples/ directory, so the link should be relative to the current directory rather than pointing to the old guides/ path.

Suggested change
multi-host setup, and [LLM Fine-tuning](../guides/llm_finetuning.md) for a
multi-host setup, and [LLM Fine-tuning](llm_finetuning.md) for a

- [Checkpointing](../guides/checkpointing.md) — `KINETIC_OUTPUT_DIR` and resumable
training.
- [LLM Fine-tuning](llm_finetuning.md) — KerasHub + Gemma walkthrough.
- [LLM Fine-tuning](../guides/llm_finetuning.md) — KerasHub + Gemma walkthrough.
Contributor


Severity: medium

The link to the LLM Fine-tuning guide is incorrect. Since both files are in the same directory (docs/examples/), use a direct relative link.

Suggested change
- [LLM Fine-tuning](../guides/llm_finetuning.md) — KerasHub + Gemma walkthrough.
- [LLM Fine-tuning](llm_finetuning.md) — KerasHub + Gemma walkthrough.

