This guide provides information to assist contributors to the Strimzi documentation.
Strimzi documentation is written in Asciidoc.
Strimzi is committed to using inclusive and respectful language in its code, web pages, and documentation. Contributions to the documentation, as with code submissions, must not contain problematic terms or phrases.
The following tools are needed to build the documentation:

Asciidoctor:: Documentation generation tool
`make`:: Make build system to build the complete documentation
`yq`:: YAML processing tool to build documentation using `make` targets
For generating API reference content, you also need the following:
For most documentation updates, Asciidoctor offers the simplest way to check the build.
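Before building, it can help to confirm that the build tools are available on your `PATH`. A minimal sketch, assuming a POSIX shell; the tool list matches the build requirements above:

```shell
# Check that each documentation build tool is installed.
# Prints one line per tool; "MISSING" flags anything you still need.
for tool in asciidoctor make yq; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```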
Strimzi uses public GitHub repositories to host files. The following repositories contain source documentation files.
strimzi-kafka-operator (GitHub):: Strimzi operators code and related documentation.
strimzi-kafka-bridge (GitHub):: Kafka Bridge code and related documentation.
strimzi.github.io (GitHub):: Strimzi web site code and quick start documentation.
The main Strimzi documentation is maintained in the `/documentation` folder of the Strimzi Operators repository.
The documentation folder is split into category folders to manage the content.
The folders contain files related to Strimzi guides and the files that provide the content for one or more of these guides: assemblies and modules. Assemblies, which usually encapsulate a feature or process, bring together the related content contained in modules. An assembly is like a sub-section or chapter in a book. A module contains procedure, concept, or reference content.
The Strimzi Overview guide is intended to help you develop an understanding of Strimzi and Apache Kafka. The guide does not contain any instructions. It provides an overview of the concepts behind Apache Kafka, the Kafka component architecture, and how Strimzi supports Kafka in a Kubernetes cluster. The guide also describes how Strimzi Operators help manage a deployment.
The guide contains high-level outlines of the processes required to deploy, configure, secure and monitor a deployment of Strimzi.
The Deploying and Managing Strimzi guide provides instructions on all the options available for deploying, managing and upgrading Strimzi. The guide describes what is deployed, and the order of deployment required to run Apache Kafka in a Kubernetes cluster.
As well as describing the deployment steps, the guide also provides pre- and post-deployment instructions to prepare for and verify a deployment. Additional deployment options described include the steps to introduce metrics.
Examples that show how you might configure components are provided. For example, you might want to modify your deployment and introduce additional features, such as Cruise Control or distributed tracing.
Upgrade instructions are provided for Strimzi and Kafka upgrades.
The Strimzi Custom Resource API Reference guide describes the configuration properties for custom resources.
The reference guide is built from two files.
include::modules/con-common-configuration-properties.adoc[leveloffset=+1]
include::modules/appendix_crds.adoc[]
The `con-common-configuration-properties.adoc` file contains descriptions of common configuration properties.
The content for the `documentation/modules/appendix_crds.adoc` file is generated directly from descriptions in the Java code. Java files in the `api/` folder are annotated so that the descriptions are picked up in the build.
import io.strimzi.crdgenerator.annotations.Description;
import io.strimzi.crdgenerator.annotations.DescriptionFile;
The tables in `appendix_crds.adoc` are built from `@Description` annotations in the Java files.
Additional information is included by adding:

* An `@DescriptionFile` annotation to the Java file
* A corresponding description file (`.adoc`) in the `documentation/api/` folder
* An `include:DESCRIPTION-FILE-NAME` reference to the `appendix_crds.adoc` file

The `include:DESCRIPTION-FILE-NAME` reference is added automatically by the Maven build, so you do not need to add it manually.
For example, to add additional configuration for the `KafkaUserQuotas` custom resource:

* `api/src/main/java/io/strimzi/api/kafka/model/KafkaUserQuotas.java` contains:
** `import io.strimzi.crdgenerator.annotations.Description;`
** `import io.strimzi.crdgenerator.annotations.DescriptionFile;`
** A `@Description` annotation with descriptions for individual properties
** An `@DescriptionFile` annotation
* `documentation/api` includes the `io.strimzi.api.kafka.model.KafkaUserQuotas.adoc` file containing the additional configuration description.

The description file requires the same name as the fully-qualified name of the related Java class.
`appendix_crds.adoc` contains a reference to include the additional configuration description:

----
### `KafkaUserQuotas` schema reference
include::../api/io.strimzi.api.kafka.model.user.KafkaUserQuotas.adoc[leveloffset=+1]
----
If you change anything in the `api` module of the Java code, you must rebuild the Strimzi Custom Resource API Reference using a `make` command.
The Kafka Bridge documentation shows how to get started using the Kafka Bridge to make HTTP requests to a Kafka cluster.
The Kafka Bridge documentation is maintained in the Kafka Bridge project in GitHub.
For information on contributing to the Kafka Bridge documentation, see the readme in the Kafka Bridge project.
Strimzi quick starts provide instructions for evaluating Strimzi using Minikube, Kubernetes kind, or Docker Desktop. Steps describe how to deploy and run Strimzi as quickly as possible, with minimal configuration.
The quick starts are maintained in the Strimzi website project in GitHub.
For information on contributing to the quick starts, see the readme in the Strimzi website project.
This section shows you how to contribute to the Strimzi documentation and contains important guidelines for creating accessible content.
Strimzi aims to make its content accessible. When contributing to the documentation, ensure that your content adheres to the following guidelines:
Images:

* Include a caption
* Provide alternative text
* Are described in the surrounding text
* Are not images of text (such as code fragments)

Links provide descriptive text about the target (not _click here_).

Tables:

* Include a caption
* Contain headings
* Provide a logical reading order
* Don't contain empty cells

Color is not used as the only visual means to convey information (not _check the green text_).
.Operators within the Strimzi architecture
image::operators.png[Operators within the Strimzi architecture]
.File connectors
[cols="2*",options="header"]
|===
|File Connector
|Description
|FileStreamSourceConnector
|Transfers data to your Kafka cluster from a file (the source).
|===
Once you have your local repository set up and have up-to-date copies of upstream content, follow these steps to contribute to the documentation.
You add content to the documentation hosted on GitHub using pull requests (PRs).
Reviewers might be assigned to the PR depending on the changes. A review from a Subject Matter Expert (SME) will check the technical aspects of the content. The PR might not require an SME review if you’re only fixing a typo or broken link.
Open your terminal
`cd` to the directory where your documentation resides
Check out the main branch
$ git checkout main
Update your local repository and fork with the upstream content
$ git pull upstream main
$ git push origin main --force
Create a new branch for your work (using the issue number is helpful)
$ git checkout -b <branch-name>
Make your edits in the editor of your choice
Save your changes locally
Build your documentation to verify that there are no build errors and that everything looks right
This can be done with the Make tooling
If you are creating new files, add the files to the repository
$ git status
$ git add <file-name>
Commit your changes
$ git commit -a -s -m "<message>"
Note that the project requires all commits to be signed off, indicating that you certify the changes with the developer certificate of origin (DCO).
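To see the effect of the `-s` flag, you can experiment in a throwaway repository. This is an illustrative sketch; the name, email, and commit message are examples:

```shell
# Create a throwaway repository and make one signed-off commit.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.name "Jane Writer"
git config user.email "jane@example.com"
echo demo > file.txt
git add file.txt
git commit -q -s -m "Fix typo in deploying guide"
# The -s flag appends a DCO trailer to the commit message:
git log -1 --format=%B
# Fix typo in deploying guide
#
# Signed-off-by: Jane Writer <jane@example.com>
```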
Push your changes to your fork
$ git push origin HEAD
If the update is rejected because the commit is behind, merge your changes
$ git pull upstream main
$ git push -f origin HEAD
Visit your fork on GitHub
Click *Compare & pull request*
Tips for which AsciiDoc markup to use when formatting text.
.Monospace (`)
[cols="2*",options="header"]
|===
|Item
|Example

|File names and paths
|The `/home/user/.bashrc` file …

|Literal values
|If the value is `true` …

|Configuration attributes
|Set the `enabled` attribute …

|Java class names
|The `java.sql.Connection` class …

|Java annotations
|The `@Clustered` annotation …
|===
.Italics (_)
[cols="2*",options="header"]
|===
|Item
|Example

|Guide titles
|See the _Installation Guide_ …

|Emphasis for a term (only emphasize the first time)
|_High availability_ refers to the …
|===
.Bold (*)
[cols="2*",options="header"]
|===
|Item
|Example

|GUI items (menus, buttons)
|Click the *Add* button and …
|===
.Underline ([.underline]#value#)
[cols="2*",options="header"]
|===
|Item
|Example

|Default item in a multi-select
|`yes\|[.underline]#no#\|maybe`
|===
The Strimzi documentation contains many code and configuration examples. Examples are a useful way of demonstrating a process.
If you want to add example code and configuration to your contribution, use the following format in an asciidoc code block.
[source,yaml,subs="+quotes,attributes"] (1)
----
apiVersion: {KafkaApiVersion}
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    replicas: *3* # <1> (2)
    # ... (3)
----
(1) Syntax for the example. Here the source language is `yaml`. Use `subs` to apply attribute substitution (`attributes`) and AsciiDoc formatting (`quotes`). In this example, `{KafkaApiVersion}` is substituted with the `kafka.strimzi.io/v1beta2` value specified in the `/shared/attributes.adoc` file. If `quotes` is specified in `subs`, AsciiDoc formatting is applied to the code block. Here, bold is applied to the `3` value of `replicas`. If `quotes` isn't used, AsciiDoc formatting is ignored in the code block.
(2) Add callouts to describe a line of code or configuration. Use a hash (`#`) before the callout number so that the example is copy-friendly.
(3) Use a hash and ellipsis (`# ...`) to show that part of the code or configuration has been omitted.
Use sentence-case headings for modules, tables, and figures. For example, _Secrets generated by the operators_, not _Secrets Generated By The Operators_.
Each file in the documentation requires an ID. The ID takes the form `[id="name-of-file-{context}"]`.
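Because IDs must be unique, a quick way to catch accidental duplicates is to list any repeated `[id=...]` lines. This is an illustrative check, not part of the Strimzi tooling; the demonstration runs against two throwaway files:

```shell
# Two modules that accidentally share the same ID.
tmp=$(mktemp -d)
printf '[id="proc-deploying-{context}"]\n= Deploying\n' > "$tmp/proc-deploying.adoc"
printf '[id="proc-deploying-{context}"]\n= Deploying again\n' > "$tmp/proc-copy.adoc"
# List every ID, then show only the duplicated ones.
grep -rho '^\[id=.*\]' "$tmp" | sort | uniq -d
# [id="proc-deploying-{context}"]
```

Run against a real checkout, you would point the `grep` at the `documentation/` folder instead of the temporary directory.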
Style user-replaced values (replaceables) with angle brackets (`< >`). Use underscores (`_`) for multi-word values. If you are referencing code or commands, also use monospace.
[cols="2*",options="header"]
|===
|Value
|Shows as

|`<my_replaceable>`
|<my_replaceable>

|`<my_code_replaceable>`
|`<my_code_replaceable>`
|===

[TIP]
====
If adding a user-replaced value within a source code block, add `subs="+quotes"` to the source tag for it to render. (For example: `[source,shell,subs="+quotes"]`.)
====
When linking to the top-level sections of books, refer to them as chapters, not sections or topics.
[cols="2*",options="header"]
|===
|Link type
|Use

|External links
|`link:github.com[GitHub^]`

|Internal links
|`xref:doc_id[Section Title]`
|===

[NOTE]
====
If you use the caret syntax (`^`) more than once in a single paragraph, you may need to escape the first occurrence with a backslash.
====
Add images for screenshots, diagrams and so on in the following format:
.Title of image
image::<image_name>.png[<description_of_image>]
You can also add inline images, such as in steps for procedures:
. My step.
+
.My inline image.
image:<image_name>.png[<description_of_image>]
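A broken image reference only shows up when the guide is rendered. A hypothetical check, not part of the Strimzi tooling, that flags references to image files that do not exist (the module content and image directory below are throwaway examples):

```shell
# List image references in a module and flag missing files.
tmp=$(mktemp -d)
mkdir "$tmp/images"
printf 'image::operators.png[Operators]\nimage::missing.png[Missing]\n' > "$tmp/module.adoc"
touch "$tmp/images/operators.png"
# Extract each referenced file name and test for its presence.
grep -o 'image::\?[^[]*' "$tmp/module.adoc" | sed 's/^image::\?//' |
while read -r img; do
  [ -f "$tmp/images/$img" ] || echo "missing: $img"
done
# missing: missing.png
```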
When you make changes to the documentation, it is good practice to do a local test build to verify that the book builds successfully and renders as you expect before you submit the pull request back to upstream main.
After you have made documentation updates in your local GitHub clone of the `strimzi-kafka-operator` project, you can build the documentation using AsciiDoctor or the `Makefile` provided with the project.
As the documentation is based on AsciiDoc, you can use AsciiDoctor to build the guides locally. To build a guide using AsciiDoctor on your local machine, run the `asciidoctor` command for the source file of the guide.
For example, this command builds the Overview:
asciidoctor <path_to_overview.adoc>
Use `make` commands from the root of the `strimzi-kafka-operator` project to build all the documentation at the same time. The documentation is output to `documentation/html`.
`make docu_clean`:: Deletes all temporary files
`make docu_check`:: Executes the documentation checks in `.azure/scripts/check_docs.sh`
`make docu_html`:: Generates HTML versions of all the guides
`make docu_htmlnoheader`:: Generates HTML versions of all the guides without the HTML headers, making them suitable for including in a website
[NOTE]
====
Before running the `make` commands, ensure that you remove any old build files from the documentation directories. Otherwise, these files are included in the documentation checks and cause build failures. For instance, if you previously generated an HTML file in the `deploying` directory using Asciidoctor, make sure to delete that file.
====
The `documentation/modules/appendix_crds.adoc` file provides the main content for the Strimzi Custom Resource API Reference. It is generated directly from the Java code when building the operators. If you change the Strimzi API, you need to regenerate the API Reference before submitting your PR by running the following commands from the root of the project:
mvn clean -DskipTests install
make crd_install
The build uses `yq`, so make sure it is kept up to date for it to work properly.
[NOTE]
====
You only have to regenerate the Strimzi Custom Resource API Reference if you changed anything in the `api` module of the Java code.
====
Standard attributes:: `shared/attributes.adoc`
Shared includes:: `shared/`
Images:: `shared/images/`
To optimize modular documentation, follow these guidelines for naming module anchors and files:
Provide an anchor in the format `[id='anchor-name']` for every module so that it can be identified by Asciidoctor when reused or cross-referenced. Give the anchor the same or a similar name as the module heading, separated by dashes:
[id='anchor-name']
= Module Heading
First sentence of topic.
[NOTE]
====
The format defined here is recommended because it is the most stable and versatile of anchor formats, and it supports variables that enable topics to be reused and cross-referenced properly. Other anchor formats exist, but they are not recommended.
====
Name the module file using the same name as the anchor used in it, which should also align with or resemble the module heading. Separate these elements with dashes. Add a prefix followed by a dash to the file name to indicate the module type, in the format `prefix-file-name`. Use `snip-` for a snippet, `con-` for a concept, `ref-` for a reference, `proc-` for a procedure, `assembly-` for an assembly, and `image-` for images and screenshots.
* `snip-guided-decision-urls.adoc` (Snippet of reusable content)
* `con-guided-decision-tables.adoc` (Concept module)
* `proc-creating-guided-decision-tables.adoc` (Procedure module for creating)
* `proc-editing-guided-decision-tables.adoc` (Procedure module for editing)
* `ref-guided-decision-table-examples.adoc` (Reference module with examples)
* `ref-guided-decision-table-columns.adoc` (Reference module with column types)
* `assembly-guided-decision-tables.adoc` (Assembly of guided decision table modules)
* `image-guided-decision-example.adoc` (Screenshot or image of guided decision table modules)
Learn more in the _Modular Documentation Reference Guide_.
This section explains how to set up your system to connect to the proper git repositories.
If using Fedora, open your terminal and enter the proper installation command.
$ yum install git (up to Fedora 21)
$ dnf install git (Fedora 22 and later)
Once you have git installed, set up your git account.
Open Terminal
Set your name and email
$ git config --global user.name "<your-name>"
$ git config --global user.email "<your-email>"
[TIP]
====
The email you specify should be the same one found in your email settings. To keep your email address hidden, see Keeping your email address private.
====
Set your git defaults
$ git config --global pull.rebase true
$ git config --global push.default simple
Fork the `strimzi-kafka-operator` upstream repository to create a copy under your own GitHub ID. This allows you to work on multiple features and push changes to branches in your own GitHub instance so that you do not have to worry about losing work. When you are ready, you can request the changes to be merged back into the upstream repository.
Open a browser and navigate to the upstream repository located at https://github.com/strimzi/strimzi-kafka-operator
Click *Fork*, located in the upper right under your profile icon.
Select your user account for the location of the forked repository. This creates your own copy of the repository under your own GitHub ID.
If you choose to use the SSH address for your clones, you will need to add an SSH Key to GitHub first.
Open Terminal.
Check to see if you have a public SSH key:
$ ls ~/.ssh/
If you don’t have a key, generate one:
$ ssh-keygen -t rsa -C "<your-email>"
Open your key in an editor:
$ cd ~/.ssh/
$ vi id_rsa.pub
Copy the contents of the file to your clipboard.
Click *New SSH Key*.
Give your key a name and paste the contents of your key file.
Click *Add SSH Key*.
Clone your forked repository to bring your GitHub repository files to your local machine. Your forked repository is now the `origin` repository for your local files.
[NOTE]
====
For more information about forking and cloning, consult the official GitHub documentation.
====
Open Terminal.
Navigate to the directory where you want to create the new repository folder.
Type the following command:
$ git clone git@github.com:<username>/strimzi-kafka-operator.git
Navigate to the newly created `strimzi-kafka-operator` folder:
$ cd strimzi-kafka-operator/
While there are fewer steps in this option, you have to enter your GitHub credentials with every change you make.
Open Terminal.
Navigate to the directory where you want to create the new repository folder.
Type the following command:
$ git clone https://github.com/<username>/strimzi-kafka-operator.git
Enter your GitHub credentials to complete the clone.
Navigate to the newly created `strimzi-kafka-operator` folder:
$ cd strimzi-kafka-operator/
Once you have your fork checked out and cloned locally, add the upstream repository as a remote.
List the current remote repositories:
$ git remote -v
origin git@github.com:<username>/strimzi-kafka-operator.git (fetch)
origin git@github.com:<username>/strimzi-kafka-operator.git (push)
Add the upstream as a remote repository and fetch its contents. This allows you to check out and work with the latest source code.
$ git remote add -f upstream git@github.com:strimzi/strimzi-kafka-operator.git
Enter your GitHub credentials to complete the remote add process.
Verify the new remote was added:
$ git remote -v
origin git@github.com:<username>/strimzi-kafka-operator.git (fetch)
origin git@github.com:<username>/strimzi-kafka-operator.git (push)
upstream git@github.com:strimzi/strimzi-kafka-operator.git (fetch)
upstream git@github.com:strimzi/strimzi-kafka-operator.git (push)
List the current remote repositories:
$ git remote -v
origin https://github.com/<username>/strimzi-kafka-operator.git (fetch)
origin https://github.com/<username>/strimzi-kafka-operator.git (push)
Add the upstream as a remote repository and fetch its contents. This allows you to check out and work with the latest source code.
$ git remote add -f upstream https://github.com/strimzi/strimzi-kafka-operator.git
Enter your GitHub credentials to complete the remote add process.
Verify the new remote was added:
$ git remote -v
origin https://github.com/<username>/strimzi-kafka-operator.git (fetch)
origin https://github.com/<username>/strimzi-kafka-operator.git (push)
upstream https://github.com/strimzi/strimzi-kafka-operator.git (fetch)
upstream https://github.com/strimzi/strimzi-kafka-operator.git (push)
If the upstream repository is moved, you can change the upstream URL by using the following command:
$ git remote set-url upstream https://github.com/strimzi/strimzi-kafka-operator.git
Use the following command any time you need to fetch the latest source code locally:
$ git fetch upstream
To delete all of your branches except the branch you are on:
$ git checkout main
$ git branch | grep -v '^\*' | xargs git branch -D
To delete one branch:
$ git checkout main
$ git branch -D <branch-name>
To resolve a merge conflict in an existing pull request:
$ git checkout <branch-name>
$ git branch -u origin/<branch-name>
$ git pull --rebase upstream main
$ git push -f origin HEAD
If your fork is both ahead of and behind the upstream, you can reset your fork to match the upstream and start with a clean slate.
$ git checkout main
$ git reset --hard upstream/main
$ git push origin main --force
$ git pull upstream main
$ git push origin main --force
Using `ssh-agent` to save your SSH key's passphrase

If you have to enter your SSH key's passphrase whenever you work with the repository from the command line, you might want to use `ssh-agent` to remember the passphrase for you.
Before using the `ssh-agent`, you will see a prompt to enter your passphrase after each git command:
$ git pull --rebase upstream main
Enter passphrase for key '/home/<username>/.ssh/id_rsa':
To add your passphrase to the `ssh-agent`:
$ ssh-add
Enter passphrase for /home/<username>/.ssh/id_rsa:
After entering your passphrase you will see confirmation that your passphrase has been saved:
Identity added: /home/<username>/.ssh/id_rsa (/home/<username>/.ssh/id_rsa)
This is the process to use if you need commits that another writer has submitted in a pull request that is not yet merged.
Check out a new topic branch from upstream/main as you normally do.
$ git fetch upstream
$ git checkout -b <new-topic-branch> upstream/main
If you have not yet added that writer’s remote repository, add it now.
$ git remote add -f <user> git@github.com:<user>/strimzi-kafka-operator.git
Rebase to bring in the changes that are in that user's outstanding `origin/<merge-request-branch>` branch.
$ git rebase <user>/<merge-request-branch>
You will see a response like the following:
First, rewinding head to replay your work on top of it...
Fast-forwarded <new-topic-branch> to <user>/<merge-request-branch>
Revised on 2024-12-21 13:26:40 UTC