diff --git a/2.0.0-SNAPSHOT/developer_guide/workflow/index.html b/2.0.0-SNAPSHOT/developer_guide/workflow/index.html
index e43836770a..5dc1cac7be 100644
--- a/2.0.0-SNAPSHOT/developer_guide/workflow/index.html
+++ b/2.0.0-SNAPSHOT/developer_guide/workflow/index.html
@@ -1061,7 +1061,7 @@
plugins {
- id("org.jetbrains.dokka") version "1.9.10-my-fix-SNAPSHOT"
+ id("org.jetbrains.dokka") version "1.9.20-my-fix-SNAPSHOT"
}
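For reference, a snapshot version like the one above only resolves if the consuming project can see your local Maven repository. A minimal sketch of the consuming project's settings.gradle.kts, assuming the artifacts were published with ./gradlew publishToMavenLocal as described in the workflow guide:

// settings.gradle.kts of the project consuming the locally built Dokka
pluginManagement {
    repositories {
        mavenLocal()          // resolves e.g. 1.9.20-my-fix-SNAPSHOT from ~/.m2
        gradlePluginPortal()
        mavenCentral()
    }
}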
diff --git a/2.0.0-SNAPSHOT/search/search_index.json b/2.0.0-SNAPSHOT/search/search_index.json
index 8764dc66f2..79d77b39c2 100644
--- a/2.0.0-SNAPSHOT/search/search_index.json
+++ b/2.0.0-SNAPSHOT/search/search_index.json
@@ -1 +1 @@
-{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Dokka \u00b6 Dokka is an API documentation engine for Kotlin. If you want to learn how to use Dokka, see documentation on kotlinlang.org . If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides .","title":"Dokka"},{"location":"#dokka","text":"Dokka is an API documentation engine for Kotlin. If you want to learn how to use Dokka, see documentation on kotlinlang.org . If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides .","title":"Dokka"},{"location":"developer_guide/introduction/","text":"Developer guides \u00b6 The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself. If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow : it will teach you how to build, debug and test Dokka locally. CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka. If you want to get into plugin development quick, see Introduction to plugin development . If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals . Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it. If you have any questions, feel free to get in touch with maintainers via Slack or GitHub .","title":"Developer guides"},{"location":"developer_guide/introduction/#developer-guides","text":"The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself. If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow : it will teach you how to build, debug and test Dokka locally. CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka. If you want to get into plugin development quick, see Introduction to plugin development . If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals . Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it. If you have any questions, feel free to get in touch with maintainers via Slack or GitHub .","title":"Developer guides"},{"location":"developer_guide/workflow/","text":"Workflow \u00b6 Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do: How to build Dokka or a plugin How to use/test locally built Dokka in a project How to debug Dokka or a plugin in IntelliJ IDEA We'll go over each step individually in this section. Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL , but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish. 
Build Dokka \u00b6 Building Dokka is pretty straightforward, with one small caveat: when you run ./gradlew build , it will run integration tests as well, which might take some time and will consume a lot of RAM, so you would usually want to exclude integration tests when building locally. ./gradlew build -x integrationTest Unit tests which are run as part of build should not take much time, but you can also skip it with -x test . Troubleshooting build \u00b6 API check failed for project .. \u00b6 If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API. If the change was intentional, run ./gradlew apiDump - it will re-generate .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure it's intentional and rational. Use / test locally built Dokka \u00b6 Having built Dokka locally, you can publish it to mavenLocal() . This will allow you to test your changes in another project as well as debug code remotely. Change dokka_version in gradle.properties to something that you will use later on as the dependency version. For instance, you can set it to something like 1.9.10-my-fix-SNAPSHOT . This version will be propagated to plugins that reside inside Dokka's project (such as mathjax , kotlin-as-java , etc). Publish it to Maven Local ( ./gradlew publishToMavenLocal ). Corresponding artifacts should appear in ~/.m2 In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository: repositories { mavenLocal () } Update your Dokka dependency to the version you've just published: plugins { id ( \"org.jetbrains.dokka\" ) version \"1.9.10-my-fix-SNAPSHOT\" } After completing these steps, you should be able to build documentation using your own version of Dokka. Debugging Dokka \u00b6 Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin. Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin. Choose a project to debug on, it needs to have some code for which documentation will be generated. Prefer using smaller projects that reproduce the exact problem or behaviour you want since the less code you have, the easier it will be to understand what's going on. You can use example projects found in dokka/examples/gradle , there's both simple single-module and more complex multi-module / multiplatform examples. For the debug project, set org.gradle.debug to true in one of the following ways: In your gradle.properties add org.gradle.debug=true When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon Run the desired Dokka task with --no-daemon . Gradle should wait until you attach with debugger before proceeding with the task, so no need to hurry here. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon . Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug in IntelliJ IDEA, Attach to process running on the default port 5005. 
You can do that either by creating a Remote JVM Debug Run/Debug configuration or by attaching to the process via Run -> Attach to process Note The reason for --no-daemon is that Gradle daemons continue to exist even after the task has completed execution, so you might hang in debug or experience issues with port was already in use if you try to run it again. If you previously ran Dokka with daemons and you are already encountering problems with it, try killing gradle daemons. For instance, via pkill -f gradle.*daemon In case you need to debug some other part of the build - consult the official Gradle tutorials on Troubleshooting Builds .","title":"Workflow"},{"location":"developer_guide/workflow/#workflow","text":"Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do: How to build Dokka or a plugin How to use/test locally built Dokka in a project How to debug Dokka or a plugin in IntelliJ IDEA We'll go over each step individually in this section. Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL , but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish.","title":"Workflow"},{"location":"developer_guide/workflow/#build-dokka","text":"Building Dokka is pretty straightforward, with one small caveat: when you run ./gradlew build , it will run integration tests as well, which might take some time and will consume a lot of RAM, so you would usually want to exclude integration tests when building locally. ./gradlew build -x integrationTest Unit tests which are run as part of build should not take much time, but you can also skip it with -x test .","title":"Build Dokka"},{"location":"developer_guide/workflow/#troubleshooting-build","text":"","title":"Troubleshooting build"},{"location":"developer_guide/workflow/#api-check-failed-for-project","text":"If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API. If the change was intentional, run ./gradlew apiDump - it will re-generate .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure it's intentional and rational.","title":"API check failed for project .."},{"location":"developer_guide/workflow/#use-test-locally-built-dokka","text":"Having built Dokka locally, you can publish it to mavenLocal() . This will allow you to test your changes in another project as well as debug code remotely. Change dokka_version in gradle.properties to something that you will use later on as the dependency version. For instance, you can set it to something like 1.9.10-my-fix-SNAPSHOT . This version will be propagated to plugins that reside inside Dokka's project (such as mathjax , kotlin-as-java , etc). Publish it to Maven Local ( ./gradlew publishToMavenLocal ). 
Corresponding artifacts should appear in ~/.m2 In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository: repositories { mavenLocal () } Update your Dokka dependency to the version you've just published: plugins { id ( \"org.jetbrains.dokka\" ) version \"1.9.10-my-fix-SNAPSHOT\" } After completing these steps, you should be able to build documentation using your own version of Dokka.","title":"Use / test locally built Dokka"},{"location":"developer_guide/workflow/#debugging-dokka","text":"Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin. Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin. Choose a project to debug on, it needs to have some code for which documentation will be generated. Prefer using smaller projects that reproduce the exact problem or behaviour you want since the less code you have, the easier it will be to understand what's going on. You can use example projects found in dokka/examples/gradle , there's both simple single-module and more complex multi-module / multiplatform examples. For the debug project, set org.gradle.debug to true in one of the following ways: In your gradle.properties add org.gradle.debug=true When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon Run the desired Dokka task with --no-daemon . Gradle should wait until you attach with debugger before proceeding with the task, so no need to hurry here. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon . Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug in IntelliJ IDEA, Attach to process running on the default port 5005. You can do that either by creating a Remote JVM Debug Run/Debug configuration or by attaching to the process via Run -> Attach to process Note The reason for --no-daemon is that Gradle daemons continue to exist even after the task has completed execution, so you might hang in debug or experience issues with port was already in use if you try to run it again. If you previously ran Dokka with daemons and you are already encountering problems with it, try killing gradle daemons. For instance, via pkill -f gradle.*daemon In case you need to debug some other part of the build - consult the official Gradle tutorials on Troubleshooting Builds .","title":"Debugging Dokka"},{"location":"developer_guide/architecture/architecture_overview/","text":"Architecture overview \u00b6 Normally, you would think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstractions. That would be the simplest and the most straightforward way to implement an API documentation engine. However, it was clear that Dokka may need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), that users might need additional features like supporting custom KDoc tags or rendering mermaid.js diagrams - all these things would require changing a lot of code inside Dokka itself if all solutions were hardcoded. For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level. 
Overview of data model \u00b6 Generating API documentation begins with input source files ( .kt , .java , etc) and ends with some output files ( .html / .md , etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model. Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage. flowchart TD Input --> Documentables --> Pages --> Output Input - generalization of sources, by default Kotlin / Java sources, but could be virtually anything Documentables - unified data model that represents any parsed sources as a tree, independent of the source language. Examples of a Documentable : class, function, package, property, etc Pages - universal model that represents output pages (e.g a function/property page) and the content it's composed of (lists, text, code blocks) that the users needs to see. Not to be confused with .html pages. Goes hand in hand with the so-called Content model . Output - specific output formats like HTML / Markdown / Javadoc and so on. This is a mapping of the pages/content model to a human-readable and visual representation. For instance: PageNode is mapped as .html file for the HTML format .md file for the Markdown format ContentList is mapped as or with some CSS styles in the HTML format Text wrapped in triple backticks for the Markdown format You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else. For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on. For a deeper dive into Dokka's model with more examples and details, see sections about Documentables and Page/Content For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions . Overview of extension points \u00b6 An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of the interface which is extending the extension point. You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API. Here's a sneak peek of the DSL: // declare your own plugin class MyPlugin : DokkaPlugin () { // create an extension point for developers to use val signatureProvider by extensionPoint < SignatureProvider > () // provide a default implementation val defaultSignatureProvider by extending { signatureProvider with KotlinSignatureProvider () } // register our own extension in Dokka's Base plugin by overriding its default implementation val dokkaBasePlugin by lazy { plugin < DokkaBase > () } val multimoduleLocationProvider by extending { ( dokkaBasePlugin . locationProviderFactory providing MultimoduleLocationProvider :: Factory override dokkaBasePlugin . locationProvider ) } } class MyExtension ( val context : DokkaContext ) { // use an existing extension val signatureProvider : SignatureProvider = context . plugin < MyPlugin > (). 
querySingle { signatureProvider } fun doSomething () { signatureProvider . signature (..) } } interface SignatureProvider { fun signature ( documentable : Documentable ): List < ContentNode > } class KotlinSignatureProvider : SignatureProvider { override fun signature ( documentable : Documentable ): List < ContentNode > = listOf () } For a deeper dive into extensions and extension points, see Introduction to Extensions . For an overview of existing extension points, see Core extension points and Base extensions . Historical context \u00b6 This is a second iteration of Dokka that was built from scratch. If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity . The general principles and general architecture are the same, although it may be outdated in some areas, so please double-check.","title":"Architecture"},{"location":"developer_guide/architecture/architecture_overview/#architecture-overview","text":"Normally, you would think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstractions. That would be the simplest and the most straightforward way to implement an API documentation engine. However, it was clear that Dokka may need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), that users might need additional features like supporting custom KDoc tags or rendering mermaid.js diagrams - all these things would require changing a lot of code inside Dokka itself if all solutions were hardcoded. For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level.","title":"Architecture overview"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-data-model","text":"Generating API documentation begins with input source files ( .kt , .java , etc) and ends with some output files ( .html / .md , etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model. Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage. flowchart TD Input --> Documentables --> Pages --> Output Input - generalization of sources, by default Kotlin / Java sources, but could be virtually anything Documentables - unified data model that represents any parsed sources as a tree, independent of the source language. Examples of a Documentable : class, function, package, property, etc Pages - universal model that represents output pages (e.g a function/property page) and the content it's composed of (lists, text, code blocks) that the users needs to see. Not to be confused with .html pages. Goes hand in hand with the so-called Content model . Output - specific output formats like HTML / Markdown / Javadoc and so on. This is a mapping of the pages/content model to a human-readable and visual representation. For instance: PageNode is mapped as .html file for the HTML format .md file for the Markdown format ContentList is mapped as
<ol> / <ul>
for the HTML format 1. / * for the Markdown format ContentCodeBlock is mapped as or with some CSS styles in the HTML format Text wrapped in triple backticks for the Markdown format You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else. For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on. For a deeper dive into Dokka's model with more examples and details, see sections about Documentables and Page/Content For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions .","title":"Overview of data model"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-extension-points","text":"An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of the interface which is extending the extension point. You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API. Here's a sneak peek of the DSL: // declare your own plugin class MyPlugin : DokkaPlugin () { // create an extension point for developers to use val signatureProvider by extensionPoint < SignatureProvider > () // provide a default implementation val defaultSignatureProvider by extending { signatureProvider with KotlinSignatureProvider () } // register our own extension in Dokka's Base plugin by overriding its default implementation val dokkaBasePlugin by lazy { plugin < DokkaBase > () } val multimoduleLocationProvider by extending { ( dokkaBasePlugin . locationProviderFactory providing MultimoduleLocationProvider :: Factory override dokkaBasePlugin . locationProvider ) } } class MyExtension ( val context : DokkaContext ) { // use an existing extension val signatureProvider : SignatureProvider = context . plugin < MyPlugin > (). querySingle { signatureProvider } fun doSomething () { signatureProvider . signature (..) } } interface SignatureProvider { fun signature ( documentable : Documentable ): List < ContentNode > } class KotlinSignatureProvider : SignatureProvider { override fun signature ( documentable : Documentable ): List < ContentNode > = listOf () } For a deeper dive into extensions and extension points, see Introduction to Extensions . For an overview of existing extension points, see Core extension points and Base extensions .","title":"Overview of extension points"},{"location":"developer_guide/architecture/architecture_overview/#historical-context","text":"This is a second iteration of Dokka that was built from scratch. If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity . 
The general principles and general architecture are the same, although it may be outdated in some areas, so please double-check.","title":"Historical context"},{"location":"developer_guide/architecture/data_model/documentable_model/","text":"Documentable Model \u00b6 The Documentable model represents the data that is parsed from some programming language sources. Think of this data as of something that could be seen or produced by a compiler frontend, it's not far off from the truth. By default, the documentables are created from: Descriptors (Kotlin's K1 compiler) Symbols (Kotlin's K2 compiler) PSI (Java's model). Code-wise, you can have a look at following classes: DefaultDescriptorToDocumentableTranslator - responsible for Kotlin -> Documentable mapping DefaultPsiToDocumentableTranslator - responsible for Java -> Documentable mapping Upon creation, the documentable model represents a collection of trees, each with DModule as root. Take some arbitrary Kotlin source code that is located within the same module: // Package 1 class Clazz ( val property : String ) { fun function ( parameter : String ) {} } fun topLevelFunction () {} // Package 2 enum class Enum { } val topLevelProperty : String This would be represented roughly as the following Documentable tree: flowchart TD DModule --> firstPackage[DPackage] firstPackage --> DClass firstPackage --> toplevelfunction[DFunction] DClass --> DProperty DClass --> DFunction DFunction --> DParameter DModule --> secondPackage[DPackage] secondPackage --> DEnum secondPackage --> secondPackageProperty[DProperty] At later stages of transformation, all trees are folded into one by DocumentableMerger . Documentable \u00b6 The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction , DPackage , DProperty , and so on. DClasslike is the base class for all class-like documentables, such as DClass , DEnum , DAnnotation and others. The contents of each documentable normally represent what you would see in the source code. For example, if you open DClass , you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific). Here's an example of a documentable: data class DClass ( val dri : DRI , val name : String , val constructors : List < DFunction > , val functions : List < DFunction > , val properties : List < DProperty > , val classlikes : List < DClasslike > , val sources : SourceSetDependent < DocumentableSource > , val visibility : SourceSetDependent < Visibility > , val companion : DObject?, val generics : List < DTypeParameter > , val supertypes : SourceSetDependent < List < TypeConstructorWithKind >> , val documentation : SourceSetDependent < DocumentationNode > , val expectPresentInSet : DokkaSourceSet?, val modifier : SourceSetDependent < Modifier > , val sourceSets : Set < DokkaSourceSet > , val isExpectActual : Boolean , val extra : PropertyContainer < DClass > = PropertyContainer . empty () ) : DClasslike (), WithAbstraction , WithCompanion , WithConstructors , WithGenerics , WithSupertypes , WithExtraProperties < DClass > There are three non-documentable classes that are important for this model: DRI SourceSetDependent ExtraProperty . 
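Before looking at those three classes, here is a minimal sketch of walking the documentable tree described above. It assumes property names such as DModule.packages and DPackage.functions / DPackage.classlikes, mirroring the DClass fields shown in the example:

import org.jetbrains.dokka.model.*

// Collects every function in a module: top-level functions plus members of classes.
// The traversal follows the tree shown above (DModule -> DPackage -> DClass -> DFunction);
// the exact property names are assumptions based on the DClass declaration.
fun DModule.allFunctions(): List<DFunction> =
    packages.flatMap { pkg ->
        pkg.functions + pkg.classlikes.filterIsInstance<DClass>().flatMap { it.functions }
    }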
DRI \u00b6 DRI stands for Dokka Resource Identifier - a unique value that identifies a specific Documentable . All references and relations between the documentables (other than direct ownership) are described using DRI . For example, DFunction with a parameter of type Foo only has Foo 's DRI , but not the actual reference to Foo 's Documentable object. Example \u00b6 For an example of what a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines : package kotlinx.coroutines import ... public abstract class MainCoroutineDispatcher : CoroutineDispatcher () { override fun limitedParallelism ( parallelism : Int ): CoroutineDispatcher { ... } } If we were to re-create the DRI of this function in code, it would look something like this: DRI ( packageName = \"kotlinx.coroutines\" , classNames = \"MainCoroutineDispatcher\" , callable = Callable ( name = \"limitedParallelism\" , receiver = null , params = listOf ( TypeConstructor ( fullyQualifiedName = \"kotlin.Int\" , params = emptyList () ) ) ), target = PointingToDeclaration , extra = null ) If you format it as String , it would look like this: kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/ SourceSetDependent \u00b6 SourceSetDependent helps handle multiplatform data by associating platform-specific data (declared with either expect or actual modifiers) with particular source sets . This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect , or code comments written for expect might be different from what's written for actual . Under the hood, it's a typealias to a Map : typealias SourceSetDependent < T > = Map < DokkaSourceSet , T > ExtraProperty \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. This element is a bit more complex, so you can read more about how to use it in a separate section . Documentation model \u00b6 The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs). DocTag \u00b6 DocTag describes a specific documentation syntax element. It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and <b>bold</b> in Java. However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside the Javadoc comments, some of which are simply not possible to reproduce with Markdown that KDocs use. DocTag elements can be deeply nested with other DocTag children elements.
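To illustrate the nesting, here is a rough sketch of the DocTag tree that a comment fragment like **bold** text might be parsed into. The P, B and Text classes (and Text's body parameter) are assumed to follow the same shape as the declarations listed in the examples below:

import org.jetbrains.dokka.model.doc.B
import org.jetbrains.dokka.model.doc.P
import org.jetbrains.dokka.model.doc.Text

// A paragraph containing a bold word followed by plain text: **bold** text
val paragraph = P(
    children = listOf(
        B(children = listOf(Text(body = "bold"))),
        Text(body = " text")
    )
)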
Examples: data class H1 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class H2 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strikethrough ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strong ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class CodeBlock ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : Code () TagWrapper \u00b6 TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return . Since each such section may contain formatted text inside it, each TagWrapper has DocTag children. /** * @author **Ben Affleck* * @return nothing, except _sometimes_ it may throw an [Error] */ fun foo () {} DocumentationNode \u00b6 DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable , usually used like this: data class DFunction ( ... val documentation : SourceSetDependent < DocumentationNode > , ... )","title":"Documentables"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable-model","text":"The Documentable model represents the data that is parsed from some programming language sources. Think of this data as of something that could be seen or produced by a compiler frontend, it's not far off from the truth. By default, the documentables are created from: Descriptors (Kotlin's K1 compiler) Symbols (Kotlin's K2 compiler) PSI (Java's model). Code-wise, you can have a look at following classes: DefaultDescriptorToDocumentableTranslator - responsible for Kotlin -> Documentable mapping DefaultPsiToDocumentableTranslator - responsible for Java -> Documentable mapping Upon creation, the documentable model represents a collection of trees, each with DModule as root. Take some arbitrary Kotlin source code that is located within the same module: // Package 1 class Clazz ( val property : String ) { fun function ( parameter : String ) {} } fun topLevelFunction () {} // Package 2 enum class Enum { } val topLevelProperty : String This would be represented roughly as the following Documentable tree: flowchart TD DModule --> firstPackage[DPackage] firstPackage --> DClass firstPackage --> toplevelfunction[DFunction] DClass --> DProperty DClass --> DFunction DFunction --> DParameter DModule --> secondPackage[DPackage] secondPackage --> DEnum secondPackage --> secondPackageProperty[DProperty] At later stages of transformation, all trees are folded into one by DocumentableMerger .","title":"Documentable Model"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable","text":"The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction , DPackage , DProperty , and so on. DClasslike is the base class for all class-like documentables, such as DClass , DEnum , DAnnotation and others. The contents of each documentable normally represent what you would see in the source code. 
For example, if you open DClass , you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific). Here's an example of a documentable: data class DClass ( val dri : DRI , val name : String , val constructors : List < DFunction > , val functions : List < DFunction > , val properties : List < DProperty > , val classlikes : List < DClasslike > , val sources : SourceSetDependent < DocumentableSource > , val visibility : SourceSetDependent < Visibility > , val companion : DObject?, val generics : List < DTypeParameter > , val supertypes : SourceSetDependent < List < TypeConstructorWithKind >> , val documentation : SourceSetDependent < DocumentationNode > , val expectPresentInSet : DokkaSourceSet?, val modifier : SourceSetDependent < Modifier > , val sourceSets : Set < DokkaSourceSet > , val isExpectActual : Boolean , val extra : PropertyContainer < DClass > = PropertyContainer . empty () ) : DClasslike (), WithAbstraction , WithCompanion , WithConstructors , WithGenerics , WithSupertypes , WithExtraProperties < DClass > There are three non-documentable classes that are important for this model: DRI SourceSetDependent ExtraProperty .","title":"Documentable"},{"location":"developer_guide/architecture/data_model/documentable_model/#dri","text":"DRI stans for Dokka Resource Identifier - a unique value that identifies a specific Documentable . All references and relations between the documentables (other than direct ownership) are described using DRI . For example, DFunction with a parameter of type Foo only has Foo 's DRI , but not the actual reference to Foo 's Documentable object.","title":"DRI"},{"location":"developer_guide/architecture/data_model/documentable_model/#example","text":"For an example of how a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines : package kotlinx.coroutines import ... public abstract class MainCoroutineDispatcher : CoroutineDispatcher () { override fun limitedParallelism ( parallelism : Int ): CoroutineDispatcher { ... } } If we were to re-create the DRI of this function in code, it would look something like this: DRI ( packageName = \"kotlinx.coroutines\" , classNames = \"MainCoroutineDispatcher\" , callable = Callable ( name = \"limitedParallelism\" , receiver = null , params = listOf ( TypeConstructor ( fullyQualifiedName = \"kotlin.Int\" , params = emptyList () ) ) ), target = PointingToDeclaration , extra = null ) If you format it as String , it would look like this: kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/","title":"Example"},{"location":"developer_guide/architecture/data_model/documentable_model/#sourcesetdependent","text":"SourceSetDependent helps handling multiplatform data by associating platform-specific data (declared with either expect or actual modifiers) with particular source sets . This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect , or code comments written for expect might be different from what's written for actual . 
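As a small illustration — relying on the Map-based typealias explained right below — here is a sketch that reports which source set each piece of a DClass's documentation belongs to. The documentation field comes from the DClass declaration above; DokkaSourceSet.displayName and DocumentationNode.children are assumptions:

import org.jetbrains.dokka.model.DClass

// SourceSetDependent<T> is a Map<DokkaSourceSet, T>, so per-platform data can be iterated per source set.
fun printDocsPerSourceSet(cls: DClass) {
    cls.documentation.forEach { (sourceSet, docs) ->
        println("${sourceSet.displayName}: ${docs.children.size} documentation tag(s)")
    }
}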
Under the hood, it's a typealias to a Map : typealias SourceSetDependent < T > = Map < DokkaSourceSet , T >","title":"SourceSetDependent"},{"location":"developer_guide/architecture/data_model/documentable_model/#extraproperty","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. This element is a bit more complex, so you can read more about how to use it in a separate section .","title":"ExtraProperty"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentation-model","text":"The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs).","title":"Documentation model"},{"location":"developer_guide/architecture/data_model/documentable_model/#doctag","text":"DocTag describes a specific documentation syntax element. It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and bold in Java. However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside the Javadoc comments, some of which are simply not possible to reproduce with Markdown that KDocs use. DocTag elements can be deeply nested with other DocTag children elements. Examples: data class H1 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class H2 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strikethrough ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strong ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class CodeBlock ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : Code ()","title":"DocTag"},{"location":"developer_guide/architecture/data_model/documentable_model/#tagwrapper","text":"TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return . Since each such section may contain formatted text inside it, each TagWrapper has DocTag children. /** * @author **Ben Affleck* * @return nothing, except _sometimes_ it may throw an [Error] */ fun foo () {}","title":"TagWrapper"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentationnode","text":"DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable , usually used like this: data class DFunction ( ... val documentation : SourceSetDependent < DocumentationNode > , ... )","title":"DocumentationNode"},{"location":"developer_guide/architecture/data_model/extra/","text":"Extra \u00b6 Introduction \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. ExtraProperty classes are available both in the Documentable and the Content models. To create a new extra, you need to implement the ExtraProperty interface. 
It is advised to use the following pattern when declaring new extras: data class CustomExtra ( [ any data relevant to your extra ] , [ any data relevant to your extra ] ): ExtraProperty < Documentable > { override val key : CustomExtra . Key < Documentable , *> = CustomExtra companion object : CustomExtra . Key < Documentable , CustomExtra > } Merge strategy (the mergeStrategyFor method) for extras is invoked during the merging of the documentables from different source sets , when the documentables being merged have their own Extra of the same type. PropertyContainer \u00b6 All extras for ContentNode and Documentable classes are stored in the PropertyContainer class instances. data class DFunction ( ... override val extra : PropertyContainer < DFunction > = PropertyContainer . empty () ... ) : WithExtraProperties < DFunction > PropertyContainer has a number of convenient functions for handling extras in a collection-like manner. The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable . Usage example \u00b6 In following example we will create a DFunction -only extra property, store it and then retrieve its value: // Extra that is applicable only to DFunction data class CustomExtra ( val customExtraValue : String ) : ExtraProperty < DFunction > { override val key : ExtraProperty . Key < Documentable , *> = CustomExtra companion object : ExtraProperty . Key < Documentable , CustomExtra > } // Storing it inside the documentable fun DFunction . withCustomExtraProperty ( data : String ): DFunction { return this . copy ( extra = extra + CustomExtra ( data ) ) } // Retrieveing it from the documentable fun DFunction . getCustomExtraPropertyValue (): String? { return this . extra [ CustomExtra ]?. customExtraValue } You can also use extras as markers, without storing any data in them: object MarkerExtra : ExtraProperty < Any > , ExtraProperty . Key < Any , MarkerExtra > { override val key : ExtraProperty . Key < Any , *> = this } fun Documentable . markIfFunction (): Documentable { return when ( this ) { is DFunction -> this . copy ( extra = extra + MarkerExtra ) else -> this } } fun WithExtraProperties < Documentable > . isMarked (): Boolean { return this . extra [ MarkerExtra ] != null }","title":"Extra properties"},{"location":"developer_guide/architecture/data_model/extra/#extra","text":"","title":"Extra"},{"location":"developer_guide/architecture/data_model/extra/#introduction","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. ExtraProperty classes are available both in the Documentable and the Content models. To create a new extra, you need to implement the ExtraProperty interface. It is advised to use the following pattern when declaring new extras: data class CustomExtra ( [ any data relevant to your extra ] , [ any data relevant to your extra ] ): ExtraProperty < Documentable > { override val key : CustomExtra . Key < Documentable , *> = CustomExtra companion object : CustomExtra . 
Key < Documentable , CustomExtra > } Merge strategy (the mergeStrategyFor method) for extras is invoked during the merging of the documentables from different source sets , when the documentables being merged have their own Extra of the same type.","title":"Introduction"},{"location":"developer_guide/architecture/data_model/extra/#propertycontainer","text":"All extras for ContentNode and Documentable classes are stored in the PropertyContainer class instances. data class DFunction ( ... override val extra : PropertyContainer < DFunction > = PropertyContainer . empty () ... ) : WithExtraProperties < DFunction > PropertyContainer has a number of convenient functions for handling extras in a collection-like manner. The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable .","title":"PropertyContainer"},{"location":"developer_guide/architecture/data_model/extra/#usage-example","text":"In following example we will create a DFunction -only extra property, store it and then retrieve its value: // Extra that is applicable only to DFunction data class CustomExtra ( val customExtraValue : String ) : ExtraProperty < DFunction > { override val key : ExtraProperty . Key < Documentable , *> = CustomExtra companion object : ExtraProperty . Key < Documentable , CustomExtra > } // Storing it inside the documentable fun DFunction . withCustomExtraProperty ( data : String ): DFunction { return this . copy ( extra = extra + CustomExtra ( data ) ) } // Retrieveing it from the documentable fun DFunction . getCustomExtraPropertyValue (): String? { return this . extra [ CustomExtra ]?. customExtraValue } You can also use extras as markers, without storing any data in them: object MarkerExtra : ExtraProperty < Any > , ExtraProperty . Key < Any , MarkerExtra > { override val key : ExtraProperty . Key < Any , *> = this } fun Documentable . markIfFunction (): Documentable { return when ( this ) { is DFunction -> this . copy ( extra = extra + MarkerExtra ) else -> this } } fun WithExtraProperties < Documentable > . isMarked (): Boolean { return this . extra [ MarkerExtra ] != null }","title":"Usage example"},{"location":"developer_guide/architecture/data_model/page_content/","text":"Page / Content Model \u00b6 Even though the Page and Content models reside on the same level (under Page ), it is easier to view them as two different models altogether, even though Content is only used in conjunction with and inside the Page model only. Page \u00b6 The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file. The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as ( .html , .md , etc), and how, is up to the Renderer extension. Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage , PackagePage , ClasslikePage , MemberPage and so on. The Page model can be represented as a tree, with RootPageNode at the root. 
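As a quick illustration of that tree shape, here is a hedged sketch that counts all pages reachable from a given node (typically the RootPageNode); it assumes only that PageNode exposes a children list:

import org.jetbrains.dokka.pages.PageNode

// Recursively counts this page plus all of its descendants in the page tree.
fun PageNode.countPages(): Int =
    1 + children.sumOf { it.countPages() }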
Here's an example of how an arbitrary project's Page tree might look like, if the project consists of a module with 3 packages, one of which contains a top level function, a top level property and a class, inside which there's a function and a property: flowchart TD RootPageNode --> firstPackage[PackagePageNode] RootPageNode --> secondPackage[PackagePageNode] RootPageNode --> thirdPackage[PackagePageNode] firstPackage --> firstPackageFirstMember[MemberPageNode - Function] firstPackage --> firstPackageSecondMember[MemberPageNode - Property] firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class] firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function] firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property] secondPackage --> etcOne[...] thirdPackage --> etcTwo[...] Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it. Content Model \u00b6 The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal. For an example, have a look at the subclasses of ContentNode : ContentText , ContentList , ContentTable , ContentCodeBlock , ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style. // real example of composing content using the `DocumentableContentBuilder` DSL orderedList { item { text ( \"This list contains a nested table:\" ) table { header { text ( \"Col1\" ) text ( \"Col2\" ) } row { text ( \"Text1\" ) text ( \"Text2\" ) } } } item { group ( styles = setOf ( TextStyle . Bold )) { text ( \"This is bald\" ) text ( \"This is also bald\" ) } } } It is the responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it, be it visually (html pages) or otherwise (json). For instance, HtmlRenderer might render ContentCodeBlock as text
, but CommonmarkRenderer might render it using backticks. DCI \u00b6 Each node is identified by a unique DCI , which stands for Dokka Content Identifier . DCI aggregates DRI s of all documentables that are used by the given ContentNode . data class DCI ( val dri : Set < DRI > , val kind : Kind ) All references to other nodes (other than direct ownership) are described using DCI . ContentKind \u00b6 ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single one tab or a block within a class's page. For example, on the same page that describes a class you can have multiple sections (== ContentKind s). One to describe functions, one to describe properties, another one to describe the constructors, and so on. Styles \u00b6 Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way. group ( styles = setOf ( TextStyle . Paragraph )) { text ( \"Text1\" , styles = setOf ( TextStyle . Bold )) text ( \"Text2\" , styles = setOf ( TextStyle . Italic )) } It is responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it. For instance, HtmlRenderer might render TextStyle.Bold as text , but CommonmarkRenderer might render it as **text** . There's a number of existing styles that you can use, most of them are supported by the HtmlRenderer extension out of the box: // for code highlighting enum class TokenStyle : Style { Keyword , Punctuation , Function , Operator , Annotation , Number , String , Boolean , Constant , Builtin , ... } enum class TextStyle : Style { Bold , Italic , Strong , Strikethrough , Paragraph , ... } enum class ContentStyle : Style { TabbedContent , RunnableSample , Wrapped , Indented , ... } Extra \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions. This element is a bit complex, so you can read more about how to use it in a separate section .","title":"Page & Content"},{"location":"developer_guide/architecture/data_model/page_content/#page-content-model","text":"Even though the Page and Content models reside on the same level (under Page ), it is easier to view them as two different models altogether, even though Content is only used in conjunction with and inside the Page model only.","title":"Page / Content Model"},{"location":"developer_guide/architecture/data_model/page_content/#page","text":"The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file. The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as ( .html , .md , etc), and how, is up to the Renderer extension. Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage , PackagePage , ClasslikePage , MemberPage and so on. The Page model can be represented as a tree, with RootPageNode at the root. 
Here's an example of how an arbitrary project's Page tree might look like, if the project consists of a module with 3 packages, one of which contains a top level function, a top level property and a class, inside which there's a function and a property: flowchart TD RootPageNode --> firstPackage[PackagePageNode] RootPageNode --> secondPackage[PackagePageNode] RootPageNode --> thirdPackage[PackagePageNode] firstPackage --> firstPackageFirstMember[MemberPageNode - Function] firstPackage --> firstPackageSecondMember[MemberPageNode - Property] firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class] firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function] firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property] secondPackage --> etcOne[...] thirdPackage --> etcTwo[...] Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it.","title":"Page"},{"location":"developer_guide/architecture/data_model/page_content/#content-model","text":"The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal. For an example, have a look at the subclasses of ContentNode : ContentText , ContentList , ContentTable , ContentCodeBlock , ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style. // real example of composing content using the `DocumentableContentBuilder` DSL orderedList { item { text ( \"This list contains a nested table:\" ) table { header { text ( \"Col1\" ) text ( \"Col2\" ) } row { text ( \"Text1\" ) text ( \"Text2\" ) } } } item { group ( styles = setOf ( TextStyle . Bold )) { text ( \"This is bald\" ) text ( \"This is also bald\" ) } } } It is the responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it, be it visually (html pages) or otherwise (json). For instance, HtmlRenderer might render ContentCodeBlock as text
, but CommonmarkRenderer might render it using backticks.","title":"Content Model"},{"location":"developer_guide/architecture/data_model/page_content/#dci","text":"Each node is identified by a unique DCI , which stands for Dokka Content Identifier . DCI aggregates DRI s of all documentables that are used by the given ContentNode . data class DCI ( val dri : Set < DRI > , val kind : Kind ) All references to other nodes (other than direct ownership) are described using DCI .","title":"DCI"},{"location":"developer_guide/architecture/data_model/page_content/#contentkind","text":"ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single one tab or a block within a class's page. For example, on the same page that describes a class you can have multiple sections (== ContentKind s). One to describe functions, one to describe properties, another one to describe the constructors, and so on.","title":"ContentKind"},{"location":"developer_guide/architecture/data_model/page_content/#styles","text":"Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way. group ( styles = setOf ( TextStyle . Paragraph )) { text ( \"Text1\" , styles = setOf ( TextStyle . Bold )) text ( \"Text2\" , styles = setOf ( TextStyle . Italic )) } It is responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it. For instance, HtmlRenderer might render TextStyle.Bold as text , but CommonmarkRenderer might render it as **text** . There's a number of existing styles that you can use, most of them are supported by the HtmlRenderer extension out of the box: // for code highlighting enum class TokenStyle : Style { Keyword , Punctuation , Function , Operator , Annotation , Number , String , Boolean , Constant , Builtin , ... } enum class TextStyle : Style { Bold , Italic , Strong , Strikethrough , Paragraph , ... } enum class ContentStyle : Style { TabbedContent , RunnableSample , Wrapped , Indented , ... }","title":"Styles"},{"location":"developer_guide/architecture/data_model/page_content/#extra","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions. This element is a bit complex, so you can read more about how to use it in a separate section .","title":"Extra"},{"location":"developer_guide/architecture/extension_points/base_plugin/","text":"Base plugin \u00b6 DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions , as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats. If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase , as it reduces the scope of changes you need to make. DokkaBase is used extensively in Dokka's own output formats. You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well. 
Extension points \u00b6 Some notable extension points defined in Dokka's Base plugin. PreMergeDocumentableTransformer \u00b6 PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation . This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged. It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal , you most likely need an implementation of PreMergeDocumentableTransformer . For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.","title":"Base extensions"},{"location":"developer_guide/architecture/extension_points/base_plugin/#base-plugin","text":"DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions , as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats. If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase , as it reduces the scope of changes you need to make. DokkaBase is used extensively in Dokka's own output formats. You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well.","title":"Base plugin"},{"location":"developer_guide/architecture/extension_points/base_plugin/#extension-points","text":"Some notable extension points defined in Dokka's Base plugin.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/base_plugin/#premergedocumentabletransformer","text":"PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation . This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged. It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal , you most likely need an implementation of PreMergeDocumentableTransformer . For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.","title":"PreMergeDocumentableTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/","text":"Core extension points \u00b6 Core extension points represent the main stages of generating documentation. These extension points are plugin and output format independent, meaning it's the very core functionality and as low-level as can get in Dokka. For higher-level extension functions that can be used in different output formats, have a look at the Base plugin . 
You can find all core extensions in the CoreExtensions class: object CoreExtensions { val preGenerationCheck by coreExtensionPoint < PreGenerationChecker > () val generation by coreExtensionPoint < Generation > () val sourceToDocumentableTranslator by coreExtensionPoint < SourceToDocumentableTranslator > () val documentableMerger by coreExtensionPoint < DocumentableMerger > () val documentableTransformer by coreExtensionPoint < DocumentableTransformer > () val documentableToPageTranslator by coreExtensionPoint < DocumentableToPageTranslator > () val pageTransformer by coreExtensionPoint < PageTransformer > () val renderer by coreExtensionPoint < Renderer > () val postActions by coreExtensionPoint < PostAction > () } On this page, we'll go over each extension point individually. PreGenerationChecker \u00b6 PreGenerationChecker can be used to run some checks and enforce some constraints. For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets , and fails if it finds any. Generation \u00b6 Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable. See Generation implementations to learn about the default implementations. SourceToDocumentableTranslator \u00b6 SourceToDocumentableTranslator translates any given sources into the Documentable model. Kotlin and Java sources are supported by default by the Base plugin , but you can analyze any language as long as you can map it to the Documentable model. For reference, see DefaultDescriptorToDocumentableTranslator for Kotlin sources translation DefaultPsiToDocumentableTranslator for Java sources translation DocumentableMerger \u00b6 DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered. DocumentableTransformer \u00b6 DocumentableTransformer performs the same function as PreMergeDocumentableTransformer , but after merging source sets. A notable example is InheritorsExtractorTransformer : it extracts inheritance information from source sets and creates an inheritance map. DocumentableToPageTranslator \u00b6 DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples. Output formats can either use the same page structure or define their own. Only a single extension of this type is expected to be registered. PageTransformer \u00b6 PageTransformer is useful if you need to add, remove or modify generated pages or their content. Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages. If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one. Renderer \u00b6 Renderer defines the rules on how to render pages and their content: which files to create and how to display the content properly. Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer . PostAction \u00b6 PostAction can be used when you want to run some actions after the documentation has been generated - for example, if you want to move some files around or log some informational messages.
Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.","title":"Core extension points"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#core-extension-points","text":"Core extension points represent the main stages of generating documentation. These extension points are plugin and output format independent, meaning it's the very core functionality and as low-level as can get in Dokka. For higher-level extension functions that can be used in different output formats, have a look at the Base plugin . You can find all core extensions in the CoreExtensions class: object CoreExtensions { val preGenerationCheck by coreExtensionPoint < PreGenerationChecker > () val generation by coreExtensionPoint < Generation > () val sourceToDocumentableTranslator by coreExtensionPoint < SourceToDocumentableTranslator > () val documentableMerger by coreExtensionPoint < DocumentableMerger > () val documentableTransformer by coreExtensionPoint < DocumentableTransformer > () val documentableToPageTranslator by coreExtensionPoint < DocumentableToPageTranslator > () val pageTransformer by coreExtensionPoint < PageTransformer > () val renderer by coreExtensionPoint < Renderer > () val postActions by coreExtensionPoint < PostAction > () } On this page, we'll go over each extension point individually.","title":"Core extension points"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pregenerationchecker","text":"PreGenerationChecker can be used to run some checks and constraints. For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets , and fails if it finds any.","title":"PreGenerationChecker"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#generation","text":"Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable. See Generation implementations to learn about the default implementations.","title":"Generation"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#sourcetodocumentabletranslator","text":"SourceToDocumentableTranslator translates any given sources into the Documentable model. Kotlin and Java sources are supported by default by the Base plugin , but you can analyze any language as long as you can map it to the Documentable model. For reference, see DefaultDescriptorToDocumentableTranslator for Kotlin sources translation DefaultPsiToDocumentableTranslator for Java sources translation","title":"SourceToDocumentableTranslator"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentablemerger","text":"DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered.","title":"DocumentableMerger"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletransformer","text":"DocumentableTransformer performs the same function as PreMergeDocumentableTransformer , but after merging source sets. 
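For illustration, a pass-through DocumentableTransformer might look roughly like this. This is a minimal sketch: the class name is made up, and the invoke(original, context) shape and import paths are assumed from dokka-core, so double-check them against the actual interface before relying on them:
import org.jetbrains.dokka.model.DModule
import org.jetbrains.dokka.plugability.DokkaContext
import org.jetbrains.dokka.transformers.documentation.DocumentableTransformer

class PackageCountLogger : DocumentableTransformer {
    // Runs on the already-merged module, so all source sets are visible at this point.
    override fun invoke(original: DModule, context: DokkaContext): DModule {
        context.logger.info("Module ${original.name} contains ${original.packages.size} packages")
        return original // no structural changes in this sketch
    }
}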
A notable example is InheritorsExtractorTransformer : it extracts inheritance information from source sets and creates an inheritance map.","title":"DocumentableTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletopagetranslator","text":"DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples. Output formats can either use the same page structure or define their own. Only a single extension of this type is expected to be registered.","title":"DocumentableToPageTranslator"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pagetransformer","text":"PageTransformer is useful if you need to add, remove or modify generated pages or their content. Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages. If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one.","title":"PageTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#renderer","text":"Renderer defines the rules on how to render pages and their content: which files to create and how to display the content properly. Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer .","title":"Renderer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#postaction","text":"PostAction can be used when you want to run some actions after the documentation has been generated - for example, if you want to move some files around or log some informational messages. Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.","title":"PostAction"},{"location":"developer_guide/architecture/extension_points/extension_points/","text":"Extension points \u00b6 In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation. Declaring extension points \u00b6 If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code. class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () } interface SampleExtensionPointInterface { fun doSomething ( input : Input ): List < Output > } class Input class Output Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples. Extending from extension points \u00b6 You can use extension points to provide your own implementations in order to customize a plugin's behaviour. If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase ), you can use the plugin querying API to do that. The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface . class MyExtendedPlugin : DokkaPlugin () { val mySampleExtensionImplementation by extending { plugin < MyPlugin > ().
sampleExtensionPoint with SampleExtensionImpl () } } class SampleExtensionImpl : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself: open class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () val defaultSampleExtension by extending { sampleExtensionPoint with DefaultSampleExtension () } } class DefaultSampleExtension : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Providing \u00b6 If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead. val defaultSampleExtension by extending { sampleExtensionPoint providing { context -> // can use context to query other extensions or get configuration DefaultSampleExtension () } } You can read more on what you can do with context in Obtaining extension instance . Override \u00b6 By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other. However, a plugin can expect only a single extension to be registered for an extension point. In this case, you can use the override keyword to override the existing registered extension: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { ( myPlugin . sampleExtensionPoint with SampleExtensionImpl () override myPlugin . defaultSampleExtension ) } } This is also useful if you wish to override some extension from DokkaBase , to disable or alter it. Order \u00b6 Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () order { before ( myPlugin . firstExtension ) after ( myPlugin . thirdExtension ) } } } Conditional apply \u00b6 If you want your extension to be registered only if some condition is true , you can use the applyIf construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () applyIf { Random . Default . nextBoolean () } } } Obtaining extension instance \u00b6 After an extension point has been created and some extensions have been registered , you can use query and querySingle functions to find all or just a single implementation. class MyExtension ( context : DokkaContext ) { // returns all registered extensions for the extension point val allSampleExtensions = context . plugin < MyPlugin > (). query { sampleExtensionPoint } // will throw an exception if more than one extension is found. // use if you expect only a single extension to be registered for the extension point val singleSampleExtensions = context . plugin < MyPlugin > (). querySingle { sampleExtensionPoint } fun invoke () { allSampleExtensions . forEach { it . doSomething ( Input ()) } singleSampleExtensions . 
doSomething ( Input ()) } } In order to have access to DokkaContext , you can use the providing keyword when registering an extension.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#extension-points","text":"In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#declaring-extension-points","text":"If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code. class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () } interface SampleExtensionPointInterface { fun doSomething ( input : Input ): List < Output > } class Input class Output Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples.","title":"Declaring extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#extending-from-extension-points","text":"You can use extension points to provide your own implementations in order to customize a plugin's behaviour. If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase ), you can use plugin querying API to do that. The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface . class MyExtendedPlugin : DokkaPlugin () { val mySampleExtensionImplementation by extending { plugin < MyPlugin > (). sampleExtensionPoint with SampleExtensionImpl () } } class SampleExtensionImpl : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself: open class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () val defaultSampleExtension by extending { sampleExtensionPoint with DefaultSampleExtension () } } class DefaultSampleExtension : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () }","title":"Extending from extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#providing","text":"If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead. val defaultSampleExtension by extending { sampleExtensionPoint providing { context -> // can use context to query other extensions or get configuration DefaultSampleExtension () } } You can read more on what you can do with context in Obtaining extension instance .","title":"Providing"},{"location":"developer_guide/architecture/extension_points/extension_points/#override","text":"By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other. However, a plugin can expect only a single extension to be registered for an extension point. 
In this case, you can use the override keyword to override the existing registered extension: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { ( myPlugin . sampleExtensionPoint with SampleExtensionImpl () override myPlugin . defaultSampleExtension ) } } This is also useful if you wish to override some extension from DokkaBase , to disable or alter it.","title":"Override"},{"location":"developer_guide/architecture/extension_points/extension_points/#order","text":"Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () order { before ( myPlugin . firstExtension ) after ( myPlugin . thirdExtension ) } } }","title":"Order"},{"location":"developer_guide/architecture/extension_points/extension_points/#conditional-apply","text":"If you want your extension to be registered only if some condition is true , you can use the applyIf construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () applyIf { Random . Default . nextBoolean () } } }","title":"Conditional apply"},{"location":"developer_guide/architecture/extension_points/extension_points/#obtaining-extension-instance","text":"After an extension point has been created and some extensions have been registered , you can use query and querySingle functions to find all or just a single implementation. class MyExtension ( context : DokkaContext ) { // returns all registered extensions for the extension point val allSampleExtensions = context . plugin < MyPlugin > (). query { sampleExtensionPoint } // will throw an exception if more than one extension is found. // use if you expect only a single extension to be registered for the extension point val singleSampleExtensions = context . plugin < MyPlugin > (). querySingle { sampleExtensionPoint } fun invoke () { allSampleExtensions . forEach { it . doSomething ( Input ()) } singleSampleExtensions . doSomething ( Input ()) } } In order to have access to DokkaContext , you can use the providing keyword when registering an extension.","title":"Obtaining extension instance"},{"location":"developer_guide/architecture/extension_points/generation_implementations/","text":"Generation implementations \u00b6 There are two main implementations of the Generation core extension point: SingleModuleGeneration - generates documentation for a single module, for instance when dokkaHtml task is invoked AllModulesPageGeneration - generates multi-module documentation, for instance when dokkaHtmlMultiModule task is invoked. SingleModuleGeneration \u00b6 SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish. Below you can see the flow of how Dokka's data model is transformed by various core and base extensions. 
flowchart TD Input -- SourceToDocumentableTranslator --> doc1[Documentables] subgraph documentables [ ] doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables] doc2 -- DocumentableMerger --> doc3[Documentables] doc3 -- DocumentableTransformer --> doc4[Documentables] end doc4 -- DocumentableToPageTranslator --> page1[Pages] subgraph ide2 [ ] page1 -- PageTransformer --> page2[Pages] end page2 -- Renderer --> Output You can read about what each stage does in Core extension points and Base plugin . AllModulesPageGeneration \u00b6 AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration . Under the hood, it just collects all of the pages generated for individual modules, and assembles it all together, creating navigation links between the modules and so on.","title":"Generation implementations"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#generation-implementations","text":"There are two main implementations of the Generation core extension point: SingleModuleGeneration - generates documentation for a single module, for instance when dokkaHtml task is invoked AllModulesPageGeneration - generates multi-module documentation, for instance when dokkaHtmlMultiModule task is invoked.","title":"Generation implementations"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#singlemodulegeneration","text":"SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish. Below you can see the flow of how Dokka's data model is transformed by various core and base extensions. flowchart TD Input -- SourceToDocumentableTranslator --> doc1[Documentables] subgraph documentables [ ] doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables] doc2 -- DocumentableMerger --> doc3[Documentables] doc3 -- DocumentableTransformer --> doc4[Documentables] end doc4 -- DocumentableToPageTranslator --> page1[Pages] subgraph ide2 [ ] page1 -- PageTransformer --> page2[Pages] end page2 -- Renderer --> Output You can read about what each stage does in Core extension points and Base plugin .","title":"SingleModuleGeneration"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#allmodulespagegeneration","text":"AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration . Under the hood, it just collects all of the pages generated for individual modules, and assembles it all together, creating navigation links between the modules and so on.","title":"AllModulesPageGeneration"},{"location":"developer_guide/community/slack/","text":"Slack channel \u00b6 Dokka has a dedicated #dokka channel in the Kotlin Community Slack , where you can ask questions and chat about using, customizing or contributing to Dokka. Follow the instructions to get an invite or connect directly .","title":"Slack"},{"location":"developer_guide/community/slack/#slack-channel","text":"Dokka has a dedicated #dokka channel in the Kotlin Community Slack , where you can ask questions and chat about using, customizing or contributing to Dokka. 
Follow the instructions to get an invite or connect directly .","title":"Slack channel"},{"location":"developer_guide/plugin-development/introduction/","text":"Introduction to plugin development \u00b6 Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box. Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more. In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions . Setup \u00b6 Template \u00b6 The easiest way to start is to use the convenient Dokka plugin template . It has pre-configured dependencies, publishing and signing of your artifacts. Manual \u00b6 At a bare minimum, a Dokka plugin requires dokka-core as a dependency: import org.jetbrains.kotlin.gradle.dsl.JvmTarget import org.jetbrains.kotlin.gradle.tasks.KotlinCompile plugins { kotlin ( \"jvm\" ) version \"\" } dependencies { compileOnly ( \"org.jetbrains.dokka:dokka-core:\" ) } tasks . withType < KotlinCompile > (). configureEach { compilerOptions . jvmTarget . set ( JvmTarget . JVM_1_8 ) } In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services . All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader . Extension points \u00b6 Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions . You can learn how to declare extension points and extensions in Introduction to Extension points . In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka. Example \u00b6 You can follow the sample plugin tutorial , which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation. For more practical examples, have a look at sources of community plugins . Help \u00b6 If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub .","title":"Plugin development"},{"location":"developer_guide/plugin-development/introduction/#introduction-to-plugin-development","text":"Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box. Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more. 
In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions .","title":"Introduction to plugin development"},{"location":"developer_guide/plugin-development/introduction/#setup","text":"","title":"Setup"},{"location":"developer_guide/plugin-development/introduction/#template","text":"The easiest way to start is to use the convenient Dokka plugin template . It has pre-configured dependencies, publishing and signing of your artifacts.","title":"Template"},{"location":"developer_guide/plugin-development/introduction/#manual","text":"At a bare minimum, a Dokka plugin requires dokka-core as a dependency: import org.jetbrains.kotlin.gradle.dsl.JvmTarget import org.jetbrains.kotlin.gradle.tasks.KotlinCompile plugins { kotlin ( \"jvm\" ) version \"\" } dependencies { compileOnly ( \"org.jetbrains.dokka:dokka-core:\" ) } tasks . withType < KotlinCompile > (). configureEach { compilerOptions . jvmTarget . set ( JvmTarget . JVM_1_8 ) } In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services . All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader .","title":"Manual"},{"location":"developer_guide/plugin-development/introduction/#extension-points","text":"Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions . You can learn how to declare extension points and extensions in Introduction to Extension points . In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka.","title":"Extension points"},{"location":"developer_guide/plugin-development/introduction/#example","text":"You can follow the sample plugin tutorial , which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation. For more practical examples, have a look at sources of community plugins .","title":"Example"},{"location":"developer_guide/plugin-development/introduction/#help","text":"If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub .","title":"Help"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/","text":"Sample plugin tutorial \u00b6 We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden. The plugin will be tested with the following code: package org.jetbrains.dokka.internal.test annotation class Internal fun shouldBeVisible () {} @Internal fun shouldBeExcludedFromDocumentation () {} Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation. Full source code of this tutorial can be found in Dokka's examples under hide-internal-api . Preparing the project \u00b6 We'll begin by using Dokka plugin template . Press the Use this template button and open this project in IntelliJ IDEA . First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own. 
For instance, package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin : package org.example.dokka.plugin import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { } After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin : org . example . dokka . plugin . HideInternalApiPlugin At this point you can also change project name in settings.gradle.kts (to hide-internal-api in our case) and groupId in build.gradle.kts . Extending Dokka \u00b6 After preparing the project we can begin extending Dokka with our own extension. Having read through Core extensions , it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables. Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it. Create a new class, place it next to your plugin and implement the abstract method. You should end up with this: package org.example.dokka.plugin import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () {} class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { return false } } Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in debugger and having a look at what fields and values a given Documentable has. To do that, we'll need to register our extension point first, then we can publish our plugin and set the breakpoint. Having read through Introduction to extensions , we now know how to register our extensions: class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } At this point we're ready to debug our plugin locally, it should already work, but do nothing. Debugging \u00b6 Please read through Debugging Dokka , it goes over the same steps in more detail and with examples. Below you will find rough instructions. First, let's begin by publishing our plugin to mavenLocal() . ./gradlew publishToMavenLocal This will publish your plugin under the groupId , artifactId and version that you've specified in your build.gradle.kts . In our case it's org.example:hide-internal-api:1.0-SNAPSHOT . Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies: dependencies { dokkaPlugin ( \"org.example:hide-internal-api:1.0-SNAPSHOT\" ) } Next, in that project let's run dokkaHtml with debug enabled: ./gradlew clean dokkaHtml -Dorg.gradle.debug = true --no-daemon Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and run jvm remote debug. If you've done everything correctly, it should stop in debugger and you should be able to observe the values contained inside d: Documentable . 
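If you'd rather not attach a debugger for this first look, a throwaway alternative (a quick sketch; remember to delete it afterwards) is to log each visited documentable from the stub implementation, re-publish the plugin and re-run dokkaHtml :
override fun shouldBeSuppressed(d: Documentable): Boolean {
    // Temporary logging only: prints every documentable the transformer visits.
    println("Visiting documentable: ${d.name} (${d::class.simpleName})")
    return false
}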
Implementing plugin logic \u00b6 Now that we've stopped at our breakpoint, let's skip ahead until we see the shouldBeExcludedFromDocumentation function in the place of d: Documentable (observe the changing name property). Looking at what's inside the object, you might notice it has 3 values in extra , one of which is Annotations . Sounds like something we need! Having poked around, we come up with the following monstrosity of code for determining whether a given documentable has the @Internal annotation (it can, of course, be refactored... later): override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } Seems like we're done with writing our plugin and can begin testing it manually. Manual testing \u00b6 At this point, the implementation of your plugin should look roughly like this: package org.example.dokka.plugin import org.jetbrains.dokka.base.DokkaBase import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Annotations import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.model.properties.WithExtraProperties import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } } Bump the plugin version in build.gradle.kts , publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time).
package org.example.dokka.plugin import org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest import kotlin.test.Test import kotlin.test.assertEquals class HideInternalApiPluginTest : BaseAbstractTest () { @Test fun `should hide annotated functions` () { val configuration = dokkaConfiguration { sourceSets { sourceSet { sourceRoots = listOf ( \"src/main/kotlin/basic/Test.kt\" ) } } } val hideInternalPlugin = HideInternalApiPlugin () testInline ( \"\"\" |/src/main/kotlin/basic/Test.kt |package org.jetbrains.dokka.internal.test | |annotation class Internal | |fun shouldBeVisible() {} | |@Internal |fun shouldBeExcludedFromDocumentation() {} \"\"\" . trimMargin (), configuration = configuration , pluginOverrides = listOf ( hideInternalPlugin ) ) { preMergeDocumentablesTransformationStage = { modules -> val testModule = modules . single { it . name == \"root\" } val testPackage = testModule . packages . single { it . name == \"org.jetbrains.dokka.internal.test\" } val packageFunctions = testPackage . functions assertEquals ( 1 , packageFunctions . size ) assertEquals ( \"shouldBeVisible\" , packageFunctions [ 0 ] . name ) } } } } Note that the package of the tested code (inside testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change that to your own if you are following along, otherwise it will fail. Things to note and remember: Your test class should extend BaseAbstractTest , which contains base utility methods for testing. You can configure Dokka to your liking, enable some specific settings, configure source sets , etc. All done via dokkaConfiguration DSL. testInline function is the main entry point for unit tests You can pass plugins to be used in a test, notice pluginOverrides parameter You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during PreMergeDocumentableTransformer stage, we can test it on the same level (that is preMergeDocumentablesTransformationStage ). You will need to write asserts using the model of whatever stage you choose. For Documentable transformation stage it's Documentable , for Page generation stage you would have Page model, and for Output you can have .html files that you will need to parse with JSoup (there are also utilities for that). Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Sample plugin tutorial"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#sample-plugin-tutorial","text":"We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden. The plugin will be tested with the following code: package org.jetbrains.dokka.internal.test annotation class Internal fun shouldBeVisible () {} @Internal fun shouldBeExcludedFromDocumentation () {} Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation. Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Sample plugin tutorial"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#preparing-the-project","text":"We'll begin by using Dokka plugin template . Press the Use this template button and open this project in IntelliJ IDEA . 
First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own. For instance, package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin : package org.example.dokka.plugin import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { } After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin : org . example . dokka . plugin . HideInternalApiPlugin At this point you can also change project name in settings.gradle.kts (to hide-internal-api in our case) and groupId in build.gradle.kts .","title":"Preparing the project"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#extending-dokka","text":"After preparing the project we can begin extending Dokka with our own extension. Having read through Core extensions , it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables. Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it. Create a new class, place it next to your plugin and implement the abstract method. You should end up with this: package org.example.dokka.plugin import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () {} class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { return false } } Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in debugger and having a look at what fields and values a given Documentable has. To do that, we'll need to register our extension point first, then we can publish our plugin and set the breakpoint. Having read through Introduction to extensions , we now know how to register our extensions: class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } At this point we're ready to debug our plugin locally, it should already work, but do nothing.","title":"Extending Dokka"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#debugging","text":"Please read through Debugging Dokka , it goes over the same steps in more detail and with examples. Below you will find rough instructions. First, let's begin by publishing our plugin to mavenLocal() . ./gradlew publishToMavenLocal This will publish your plugin under the groupId , artifactId and version that you've specified in your build.gradle.kts . In our case it's org.example:hide-internal-api:1.0-SNAPSHOT . 
Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies: dependencies { dokkaPlugin ( \"org.example:hide-internal-api:1.0-SNAPSHOT\" ) } Next, in that project let's run dokkaHtml with debug enabled: ./gradlew clean dokkaHtml -Dorg.gradle.debug = true --no-daemon Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and run a JVM remote debug session. If you've done everything correctly, it should stop in the debugger and you should be able to observe the values contained inside d: Documentable .","title":"Debugging"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#implementing-plugin-logic","text":"Now that we've stopped at our breakpoint, let's skip ahead until we see the shouldBeExcludedFromDocumentation function in the place of d: Documentable (observe the changing name property). Looking at what's inside the object, you might notice it has 3 values in extra , one of which is Annotations . Sounds like something we need! Having poked around, we come up with the following monstrosity of code for determining whether a given documentable has the @Internal annotation (it can, of course, be refactored... later): override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } Seems like we're done with writing our plugin and can begin testing it manually.","title":"Implementing plugin logic"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#manual-testing","text":"At this point, the implementation of your plugin should look roughly like this: package org.example.dokka.plugin import org.jetbrains.dokka.base.DokkaBase import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Annotations import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.model.properties.WithExtraProperties import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } } Bump the plugin version in build.gradle.kts , publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time).
It should work, you should not be able to see shouldBeExcludedFromDocumentation function in generated documentation. Manual testing is cool and all, but wouldn't it be better if we could somehow write unit tests for it? Indeed!","title":"Manual testing"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#unit-testing","text":"You might've noticed that plugin template comes with a pre-made test class. Feel free to move it to another package and rename it. We are mostly interested in a single test case - functions annotated with @Internal should be hidden, while all other public functions should be visible. Plugin API comes with a set of convenient test utilities that are used to test Dokka itself, so it covers a wide range of use cases. When in doubt, see Dokka's tests for reference. Below you will find a complete unit test that passes, and the main takeaways below that. package org.example.dokka.plugin import org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest import kotlin.test.Test import kotlin.test.assertEquals class HideInternalApiPluginTest : BaseAbstractTest () { @Test fun `should hide annotated functions` () { val configuration = dokkaConfiguration { sourceSets { sourceSet { sourceRoots = listOf ( \"src/main/kotlin/basic/Test.kt\" ) } } } val hideInternalPlugin = HideInternalApiPlugin () testInline ( \"\"\" |/src/main/kotlin/basic/Test.kt |package org.jetbrains.dokka.internal.test | |annotation class Internal | |fun shouldBeVisible() {} | |@Internal |fun shouldBeExcludedFromDocumentation() {} \"\"\" . trimMargin (), configuration = configuration , pluginOverrides = listOf ( hideInternalPlugin ) ) { preMergeDocumentablesTransformationStage = { modules -> val testModule = modules . single { it . name == \"root\" } val testPackage = testModule . packages . single { it . name == \"org.jetbrains.dokka.internal.test\" } val packageFunctions = testPackage . functions assertEquals ( 1 , packageFunctions . size ) assertEquals ( \"shouldBeVisible\" , packageFunctions [ 0 ] . name ) } } } } Note that the package of the tested code (inside testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change that to your own if you are following along, otherwise it will fail. Things to note and remember: Your test class should extend BaseAbstractTest , which contains base utility methods for testing. You can configure Dokka to your liking, enable some specific settings, configure source sets , etc. All done via dokkaConfiguration DSL. testInline function is the main entry point for unit tests You can pass plugins to be used in a test, notice pluginOverrides parameter You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during PreMergeDocumentableTransformer stage, we can test it on the same level (that is preMergeDocumentablesTransformationStage ). You will need to write asserts using the model of whatever stage you choose. For Documentable transformation stage it's Documentable , for Page generation stage you would have Page model, and for Output you can have .html files that you will need to parse with JSoup (there are also utilities for that). Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Unit testing"}]}
\ No newline at end of file
+{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Dokka \u00b6 Dokka is an API documentation engine for Kotlin. If you want to learn how to use Dokka, see documentation on kotlinlang.org . If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides .","title":"Dokka"},{"location":"#dokka","text":"Dokka is an API documentation engine for Kotlin. If you want to learn how to use Dokka, see documentation on kotlinlang.org . If you want to learn more about Dokka's internals and/or how to write Dokka plugins, see Developer guides .","title":"Dokka"},{"location":"developer_guide/introduction/","text":"Developer guides \u00b6 The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself. If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow : it will teach you how to build, debug and test Dokka locally. CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka. If you want to get into plugin development quick, see Introduction to plugin development . If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals . Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it. If you have any questions, feel free to get in touch with maintainers via Slack or GitHub .","title":"Developer guides"},{"location":"developer_guide/introduction/#developer-guides","text":"The purpose of the Developer guides documentation is to get you acquainted with Dokka's internals so that you can start developing your own plugins or contributing features and fixes to Dokka itself. If you want to start hacking on Dokka right away, the only thing you need to be aware of is the general workflow : it will teach you how to build, debug and test Dokka locally. CONTRIBUTING.md contains information that can be useful if you want to contribute to Dokka. If you want to get into plugin development quick, see Introduction to plugin development . If you have time to spare and want to know more about Dokka's internals, its architecture and capabilities, follow Architecture overview and subsequent sections inside Internals . Having read through all the developer guides, you'll have a pretty good understanding of Dokka and how to develop for it. If you have any questions, feel free to get in touch with maintainers via Slack or GitHub .","title":"Developer guides"},{"location":"developer_guide/workflow/","text":"Workflow \u00b6 Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do: How to build Dokka or a plugin How to use/test locally built Dokka in a project How to debug Dokka or a plugin in IntelliJ IDEA We'll go over each step individually in this section. Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL , but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish. 
Build Dokka \u00b6 Building Dokka is pretty straightforward, with one small caveat: when you run ./gradlew build , it will run integration tests as well, which might take some time and will consume a lot of RAM, so you would usually want to exclude integration tests when building locally. ./gradlew build -x integrationTest Unit tests which are run as part of build should not take much time, but you can also skip it with -x test . Troubleshooting build \u00b6 API check failed for project .. \u00b6 If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API. If the change was intentional, run ./gradlew apiDump - it will re-generate .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure it's intentional and rational. Use / test locally built Dokka \u00b6 Having built Dokka locally, you can publish it to mavenLocal() . This will allow you to test your changes in another project as well as debug code remotely. Change dokka_version in gradle.properties to something that you will use later on as the dependency version. For instance, you can set it to something like 1.9.20-my-fix-SNAPSHOT . This version will be propagated to plugins that reside inside Dokka's project (such as mathjax , kotlin-as-java , etc). Publish it to Maven Local ( ./gradlew publishToMavenLocal ). Corresponding artifacts should appear in ~/.m2 In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository: repositories { mavenLocal () } Update your Dokka dependency to the version you've just published: plugins { id ( \"org.jetbrains.dokka\" ) version \"1.9.20-my-fix-SNAPSHOT\" } After completing these steps, you should be able to build documentation using your own version of Dokka. Debugging Dokka \u00b6 Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin. Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin. Choose a project to debug on, it needs to have some code for which documentation will be generated. Prefer using smaller projects that reproduce the exact problem or behaviour you want since the less code you have, the easier it will be to understand what's going on. You can use example projects found in dokka/examples/gradle , there's both simple single-module and more complex multi-module / multiplatform examples. For the debug project, set org.gradle.debug to true in one of the following ways: In your gradle.properties add org.gradle.debug=true When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon Run the desired Dokka task with --no-daemon . Gradle should wait until you attach with debugger before proceeding with the task, so no need to hurry here. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon . Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug in IntelliJ IDEA, Attach to process running on the default port 5005. 
You can do that either by creating a Remote JVM Debug Run/Debug configuration or by attaching to the process via Run -> Attach to process Note The reason for --no-daemon is that Gradle daemons continue to exist even after the task has completed execution, so you might hang in debug or experience issues with port was already in use if you try to run it again. If you previously ran Dokka with daemons and you are already encountering problems with it, try killing gradle daemons. For instance, via pkill -f gradle.*daemon In case you need to debug some other part of the build - consult the official Gradle tutorials on Troubleshooting Builds .","title":"Workflow"},{"location":"developer_guide/workflow/#workflow","text":"Whether you're contributing a feature/fix to Dokka itself or developing a Dokka plugin, there are 3 essential things you need to know how to do: How to build Dokka or a plugin How to use/test locally built Dokka in a project How to debug Dokka or a plugin in IntelliJ IDEA We'll go over each step individually in this section. Examples below will be specific to Gradle and Gradle\u2019s Kotlin DSL , but you can apply the same principles and run/test/debug with CLI/Maven runners and build configurations if you wish.","title":"Workflow"},{"location":"developer_guide/workflow/#build-dokka","text":"Building Dokka is pretty straightforward, with one small caveat: when you run ./gradlew build , it will run integration tests as well, which might take some time and will consume a lot of RAM, so you would usually want to exclude integration tests when building locally. ./gradlew build -x integrationTest Unit tests which are run as part of build should not take much time, but you can also skip it with -x test .","title":"Build Dokka"},{"location":"developer_guide/workflow/#troubleshooting-build","text":"","title":"Troubleshooting build"},{"location":"developer_guide/workflow/#api-check-failed-for-project","text":"If you see a message like API check failed for project .. during the build phase, it indicates that the binary compatibility check has failed, meaning you've changed/added/removed some public API. If the change was intentional, run ./gradlew apiDump - it will re-generate .api files with signatures, and you should be able to build Dokka with no errors. These updated files need to be committed as well. Maintainers will review API changes thoroughly, so please make sure it's intentional and rational.","title":"API check failed for project .."},{"location":"developer_guide/workflow/#use-test-locally-built-dokka","text":"Having built Dokka locally, you can publish it to mavenLocal() . This will allow you to test your changes in another project as well as debug code remotely. Change dokka_version in gradle.properties to something that you will use later on as the dependency version. For instance, you can set it to something like 1.9.20-my-fix-SNAPSHOT . This version will be propagated to plugins that reside inside Dokka's project (such as mathjax , kotlin-as-java , etc). Publish it to Maven Local ( ./gradlew publishToMavenLocal ). 
Corresponding artifacts should appear in ~/.m2 In the project you want to generate documentation for or debug on, add maven local as a plugin/dependency repository: repositories { mavenLocal () } Update your Dokka dependency to the version you've just published: plugins { id ( \"org.jetbrains.dokka\" ) version \"1.9.20-my-fix-SNAPSHOT\" } After completing these steps, you should be able to build documentation using your own version of Dokka.","title":"Use / test locally built Dokka"},{"location":"developer_guide/workflow/#debugging-dokka","text":"Dokka is essentially a Gradle plugin, so you can debug it the same way you would any other Gradle plugin. Below you'll find instructions on how to debug Dokka's internal logic, but you can apply the same principles if you wish to debug a Dokka plugin. Choose a project to debug on, it needs to have some code for which documentation will be generated. Prefer using smaller projects that reproduce the exact problem or behaviour you want since the less code you have, the easier it will be to understand what's going on. You can use example projects found in dokka/examples/gradle , there's both simple single-module and more complex multi-module / multiplatform examples. For the debug project, set org.gradle.debug to true in one of the following ways: In your gradle.properties add org.gradle.debug=true When running Dokka tasks: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon Run the desired Dokka task with --no-daemon . Gradle should wait until you attach with debugger before proceeding with the task, so no need to hurry here. Example: ./gradlew dokkaHtml -Dorg.gradle.debug=true --no-daemon . Open Dokka in IntelliJ IDEA, set a breakpoint and, using remote debug in IntelliJ IDEA, Attach to process running on the default port 5005. You can do that either by creating a Remote JVM Debug Run/Debug configuration or by attaching to the process via Run -> Attach to process Note The reason for --no-daemon is that Gradle daemons continue to exist even after the task has completed execution, so you might hang in debug or experience issues with port was already in use if you try to run it again. If you previously ran Dokka with daemons and you are already encountering problems with it, try killing gradle daemons. For instance, via pkill -f gradle.*daemon In case you need to debug some other part of the build - consult the official Gradle tutorials on Troubleshooting Builds .","title":"Debugging Dokka"},{"location":"developer_guide/architecture/architecture_overview/","text":"Architecture overview \u00b6 Normally, you would think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstractions. That would be the simplest and the most straightforward way to implement an API documentation engine. However, it was clear that Dokka may need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), that users might need additional features like supporting custom KDoc tags or rendering mermaid.js diagrams - all these things would require changing a lot of code inside Dokka itself if all solutions were hardcoded. For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level. 
Overview of data model \u00b6 Generating API documentation begins with input source files ( .kt , .java , etc) and ends with some output files ( .html / .md , etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model. Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage. flowchart TD Input --> Documentables --> Pages --> Output Input - generalization of sources, by default Kotlin / Java sources, but could be virtually anything Documentables - unified data model that represents any parsed sources as a tree, independent of the source language. Examples of a Documentable : class, function, package, property, etc Pages - universal model that represents output pages (e.g a function/property page) and the content it's composed of (lists, text, code blocks) that the users needs to see. Not to be confused with .html pages. Goes hand in hand with the so-called Content model . Output - specific output formats like HTML / Markdown / Javadoc and so on. This is a mapping of the pages/content model to a human-readable and visual representation. For instance: PageNode is mapped as .html file for the HTML format .md file for the Markdown format ContentList is mapped as - /
for the HTML format 1. / * for the Markdown format ContentCodeBlock is mapped as or with some CSS styles in the HTML format Text wrapped in triple backticks for the Markdown format You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else. For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on. For a deeper dive into Dokka's model with more examples and details, see sections about Documentables and Page/Content For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions . Overview of extension points \u00b6 An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of the interface which is extending the extension point. You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API. Here's a sneak peek of the DSL: // declare your own plugin class MyPlugin : DokkaPlugin () { // create an extension point for developers to use val signatureProvider by extensionPoint < SignatureProvider > () // provide a default implementation val defaultSignatureProvider by extending { signatureProvider with KotlinSignatureProvider () } // register our own extension in Dokka's Base plugin by overriding its default implementation val dokkaBasePlugin by lazy { plugin < DokkaBase > () } val multimoduleLocationProvider by extending { ( dokkaBasePlugin . locationProviderFactory providing MultimoduleLocationProvider :: Factory override dokkaBasePlugin . locationProvider ) } } class MyExtension ( val context : DokkaContext ) { // use an existing extension val signatureProvider : SignatureProvider = context . plugin < MyPlugin > (). querySingle { signatureProvider } fun doSomething () { signatureProvider . signature (..) } } interface SignatureProvider { fun signature ( documentable : Documentable ): List < ContentNode > } class KotlinSignatureProvider : SignatureProvider { override fun signature ( documentable : Documentable ): List < ContentNode > = listOf () } For a deeper dive into extensions and extension points, see Introduction to Extensions . For an overview of existing extension points, see Core extension points and Base extensions . Historical context \u00b6 This is a second iteration of Dokka that was built from scratch. If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity . The general principles and general architecture are the same, although it may be outdated in some areas, so please double-check.","title":"Architecture"},{"location":"developer_guide/architecture/architecture_overview/#architecture-overview","text":"Normally, you would think that a tool like Dokka simply parses some programming language sources and generates HTML pages for whatever it sees along the way, with little to no abstractions. 
That would be the simplest and the most straightforward way to implement an API documentation engine. However, it was clear that Dokka may need to generate documentation from various sources (not only Kotlin), that users might request additional output formats (like Markdown), that users might need additional features like supporting custom KDoc tags or rendering mermaid.js diagrams - all these things would require changing a lot of code inside Dokka itself if all solutions were hardcoded. For this reason, Dokka was built from the ground up to be easily extensible and customizable by adding several layers of abstractions to the data model, and by providing pluggable extension points, giving you the ability to introduce selective changes on a given level.","title":"Architecture overview"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-data-model","text":"Generating API documentation begins with input source files ( .kt , .java , etc) and ends with some output files ( .html / .md , etc). However, to allow for extensibility and customization, several input and output independent abstractions have been added to the data model. Below you can find the general pipeline of processing data gathered from sources and the explanation for each stage. flowchart TD Input --> Documentables --> Pages --> Output Input - generalization of sources, by default Kotlin / Java sources, but could be virtually anything Documentables - unified data model that represents any parsed sources as a tree, independent of the source language. Examples of a Documentable : class, function, package, property, etc Pages - universal model that represents output pages (e.g a function/property page) and the content it's composed of (lists, text, code blocks) that the users needs to see. Not to be confused with .html pages. Goes hand in hand with the so-called Content model . Output - specific output formats like HTML / Markdown / Javadoc and so on. This is a mapping of the pages/content model to a human-readable and visual representation. For instance: PageNode is mapped as .html file for the HTML format .md file for the Markdown format ContentList is mapped as
<ol> / <ul>
for the HTML format 1. / * for the Markdown format ContentCodeBlock is mapped as or with some CSS styles in the HTML format Text wrapped in triple backticks for the Markdown format You, as a Dokka developer or a plugin writer, can use extension points to introduce selective changes to the model on one particular level without altering everything else. For instance, if you wanted to make an annotation / function / class invisible in the final documentation, you would only need to modify the Documentables level by filtering undesirable declarations out. If you wanted to display all overloaded methods on the same page instead of on separate ones, you would only need to modify the Pages layer by merging multiple pages into one, and so on. For a deeper dive into Dokka's model with more examples and details, see sections about Documentables and Page/Content For an overview of existing extension points that let you transform Dokka's models, see Core extension points and Base extensions .","title":"Overview of data model"},{"location":"developer_guide/architecture/architecture_overview/#overview-of-extension-points","text":"An extension point usually represents a pluggable interface that performs an action during one of the stages of generating documentation. An extension is, therefore, an implementation of the interface which is extending the extension point. You can create extension points, provide your own implementations (extensions) and configure them. All of this is possible with Dokka's plugin / extension point API. Here's a sneak peek of the DSL: // declare your own plugin class MyPlugin : DokkaPlugin () { // create an extension point for developers to use val signatureProvider by extensionPoint < SignatureProvider > () // provide a default implementation val defaultSignatureProvider by extending { signatureProvider with KotlinSignatureProvider () } // register our own extension in Dokka's Base plugin by overriding its default implementation val dokkaBasePlugin by lazy { plugin < DokkaBase > () } val multimoduleLocationProvider by extending { ( dokkaBasePlugin . locationProviderFactory providing MultimoduleLocationProvider :: Factory override dokkaBasePlugin . locationProvider ) } } class MyExtension ( val context : DokkaContext ) { // use an existing extension val signatureProvider : SignatureProvider = context . plugin < MyPlugin > (). querySingle { signatureProvider } fun doSomething () { signatureProvider . signature (..) } } interface SignatureProvider { fun signature ( documentable : Documentable ): List < ContentNode > } class KotlinSignatureProvider : SignatureProvider { override fun signature ( documentable : Documentable ): List < ContentNode > = listOf () } For a deeper dive into extensions and extension points, see Introduction to Extensions . For an overview of existing extension points, see Core extension points and Base extensions .","title":"Overview of extension points"},{"location":"developer_guide/architecture/architecture_overview/#historical-context","text":"This is a second iteration of Dokka that was built from scratch. If you want to learn more about why Dokka was redesigned this way, watch this great talk by Pawe\u0142 Marks: New Dokka - Designed for Fearless Creativity . 
The general principles and general architecture are the same, although it may be outdated in some areas, so please double-check.","title":"Historical context"},{"location":"developer_guide/architecture/data_model/documentable_model/","text":"Documentable Model \u00b6 The Documentable model represents the data that is parsed from some programming language sources. Think of this data as of something that could be seen or produced by a compiler frontend, it's not far off from the truth. By default, the documentables are created from: Descriptors (Kotlin's K1 compiler) Symbols (Kotlin's K2 compiler) PSI (Java's model). Code-wise, you can have a look at following classes: DefaultDescriptorToDocumentableTranslator - responsible for Kotlin -> Documentable mapping DefaultPsiToDocumentableTranslator - responsible for Java -> Documentable mapping Upon creation, the documentable model represents a collection of trees, each with DModule as root. Take some arbitrary Kotlin source code that is located within the same module: // Package 1 class Clazz ( val property : String ) { fun function ( parameter : String ) {} } fun topLevelFunction () {} // Package 2 enum class Enum { } val topLevelProperty : String This would be represented roughly as the following Documentable tree: flowchart TD DModule --> firstPackage[DPackage] firstPackage --> DClass firstPackage --> toplevelfunction[DFunction] DClass --> DProperty DClass --> DFunction DFunction --> DParameter DModule --> secondPackage[DPackage] secondPackage --> DEnum secondPackage --> secondPackageProperty[DProperty] At later stages of transformation, all trees are folded into one by DocumentableMerger . Documentable \u00b6 The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction , DPackage , DProperty , and so on. DClasslike is the base class for all class-like documentables, such as DClass , DEnum , DAnnotation and others. The contents of each documentable normally represent what you would see in the source code. For example, if you open DClass , you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific). Here's an example of a documentable: data class DClass ( val dri : DRI , val name : String , val constructors : List < DFunction > , val functions : List < DFunction > , val properties : List < DProperty > , val classlikes : List < DClasslike > , val sources : SourceSetDependent < DocumentableSource > , val visibility : SourceSetDependent < Visibility > , val companion : DObject?, val generics : List < DTypeParameter > , val supertypes : SourceSetDependent < List < TypeConstructorWithKind >> , val documentation : SourceSetDependent < DocumentationNode > , val expectPresentInSet : DokkaSourceSet?, val modifier : SourceSetDependent < Modifier > , val sourceSets : Set < DokkaSourceSet > , val isExpectActual : Boolean , val extra : PropertyContainer < DClass > = PropertyContainer . empty () ) : DClasslike (), WithAbstraction , WithCompanion , WithConstructors , WithGenerics , WithSupertypes , WithExtraProperties < DClass > There are three non-documentable classes that are important for this model: DRI SourceSetDependent ExtraProperty . 
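Before moving on to those classes, here is a minimal sketch - not part of Dokka's own sources - of how the documentable tree shown above might be traversed; it assumes only the collection properties shown in the DClass example ( packages , classlikes , functions ) and that the model classes live in the org.jetbrains.dokka.model package:
import org.jetbrains.dokka.model.*

// A minimal sketch, assuming the property names shown in the DClass example above.
// Collects every function declared directly in a package or inside one of its classes.
fun DModule.collectFunctions(): List<DFunction> =
    packages.flatMap { pkg ->
        pkg.functions + pkg.classlikes.filterIsInstance<DClass>().flatMap { it.functions }
    }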
DRI \u00b6 DRI stands for Dokka Resource Identifier - a unique value that identifies a specific Documentable . All references and relations between the documentables (other than direct ownership) are described using DRI . For example, DFunction with a parameter of type Foo only has Foo 's DRI , but not the actual reference to Foo 's Documentable object. Example \u00b6 For an example of what a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines : package kotlinx.coroutines import ... public abstract class MainCoroutineDispatcher : CoroutineDispatcher () { override fun limitedParallelism ( parallelism : Int ): CoroutineDispatcher { ... } } If we were to re-create the DRI of this function in code, it would look something like this: DRI ( packageName = \"kotlinx.coroutines\" , classNames = \"MainCoroutineDispatcher\" , callable = Callable ( name = \"limitedParallelism\" , receiver = null , params = listOf ( TypeConstructor ( fullyQualifiedName = \"kotlin.Int\" , params = emptyList () ) ) ), target = PointingToDeclaration , extra = null ) If you format it as String , it would look like this: kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/ SourceSetDependent \u00b6 SourceSetDependent helps handle multiplatform data by associating platform-specific data (declared with either expect or actual modifiers) with particular source sets . This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect , or code comments written for expect might be different from what's written for actual . Under the hood, it's a typealias to a Map : typealias SourceSetDependent < T > = Map < DokkaSourceSet , T > ExtraProperty \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. This element is a bit more complex, so you can read more about how to use it in a separate section . Documentation model \u00b6 The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs). DocTag \u00b6 DocTag describes a specific documentation syntax element. It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and <b>bold</b> in Java. However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside the Javadoc comments, some of which are simply not possible to reproduce with Markdown that KDocs use. DocTag elements can be deeply nested with other DocTag children elements. 
Examples: data class H1 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class H2 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strikethrough ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strong ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class CodeBlock ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : Code () TagWrapper \u00b6 TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return . Since each such section may contain formatted text inside it, each TagWrapper has DocTag children. /** * @author **Ben Affleck* * @return nothing, except _sometimes_ it may throw an [Error] */ fun foo () {} DocumentationNode \u00b6 DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable , usually used like this: data class DFunction ( ... val documentation : SourceSetDependent < DocumentationNode > , ... )","title":"Documentables"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable-model","text":"The Documentable model represents the data that is parsed from some programming language sources. Think of this data as of something that could be seen or produced by a compiler frontend, it's not far off from the truth. By default, the documentables are created from: Descriptors (Kotlin's K1 compiler) Symbols (Kotlin's K2 compiler) PSI (Java's model). Code-wise, you can have a look at following classes: DefaultDescriptorToDocumentableTranslator - responsible for Kotlin -> Documentable mapping DefaultPsiToDocumentableTranslator - responsible for Java -> Documentable mapping Upon creation, the documentable model represents a collection of trees, each with DModule as root. Take some arbitrary Kotlin source code that is located within the same module: // Package 1 class Clazz ( val property : String ) { fun function ( parameter : String ) {} } fun topLevelFunction () {} // Package 2 enum class Enum { } val topLevelProperty : String This would be represented roughly as the following Documentable tree: flowchart TD DModule --> firstPackage[DPackage] firstPackage --> DClass firstPackage --> toplevelfunction[DFunction] DClass --> DProperty DClass --> DFunction DFunction --> DParameter DModule --> secondPackage[DPackage] secondPackage --> DEnum secondPackage --> secondPackageProperty[DProperty] At later stages of transformation, all trees are folded into one by DocumentableMerger .","title":"Documentable Model"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentable","text":"The main building block of the documentable model is the Documentable class. It is the base class for all more specific types. All implementations represent elements of source code with mostly self-explanatory names: DFunction , DPackage , DProperty , and so on. DClasslike is the base class for all class-like documentables, such as DClass , DEnum , DAnnotation and others. The contents of each documentable normally represent what you would see in the source code. 
For example, if you open DClass , you should find that it contains references to functions, properties, companion objects, constructors and so on. DEnum should have references to its entries, and DPackage can have references to both classlikes and top-level functions and properties (Kotlin-specific). Here's an example of a documentable: data class DClass ( val dri : DRI , val name : String , val constructors : List < DFunction > , val functions : List < DFunction > , val properties : List < DProperty > , val classlikes : List < DClasslike > , val sources : SourceSetDependent < DocumentableSource > , val visibility : SourceSetDependent < Visibility > , val companion : DObject?, val generics : List < DTypeParameter > , val supertypes : SourceSetDependent < List < TypeConstructorWithKind >> , val documentation : SourceSetDependent < DocumentationNode > , val expectPresentInSet : DokkaSourceSet?, val modifier : SourceSetDependent < Modifier > , val sourceSets : Set < DokkaSourceSet > , val isExpectActual : Boolean , val extra : PropertyContainer < DClass > = PropertyContainer . empty () ) : DClasslike (), WithAbstraction , WithCompanion , WithConstructors , WithGenerics , WithSupertypes , WithExtraProperties < DClass > There are three non-documentable classes that are important for this model: DRI SourceSetDependent ExtraProperty .","title":"Documentable"},{"location":"developer_guide/architecture/data_model/documentable_model/#dri","text":"DRI stans for Dokka Resource Identifier - a unique value that identifies a specific Documentable . All references and relations between the documentables (other than direct ownership) are described using DRI . For example, DFunction with a parameter of type Foo only has Foo 's DRI , but not the actual reference to Foo 's Documentable object.","title":"DRI"},{"location":"developer_guide/architecture/data_model/documentable_model/#example","text":"For an example of how a DRI can look like, let's take the limitedParallelism function from kotlinx.coroutines : package kotlinx.coroutines import ... public abstract class MainCoroutineDispatcher : CoroutineDispatcher () { override fun limitedParallelism ( parallelism : Int ): CoroutineDispatcher { ... } } If we were to re-create the DRI of this function in code, it would look something like this: DRI ( packageName = \"kotlinx.coroutines\" , classNames = \"MainCoroutineDispatcher\" , callable = Callable ( name = \"limitedParallelism\" , receiver = null , params = listOf ( TypeConstructor ( fullyQualifiedName = \"kotlin.Int\" , params = emptyList () ) ) ), target = PointingToDeclaration , extra = null ) If you format it as String , it would look like this: kotlinx.coroutines/MainCoroutineDispatcher/limitedParallelism/#kotlin.Int/PointingToDeclaration/","title":"Example"},{"location":"developer_guide/architecture/data_model/documentable_model/#sourcesetdependent","text":"SourceSetDependent helps handling multiplatform data by associating platform-specific data (declared with either expect or actual modifiers) with particular source sets . This comes in handy if the expect / actual declarations differ. For example, the default value for actual might differ from that declared in expect , or code comments written for expect might be different from what's written for actual . 
Under the hood, it's a typealias to a Map : typealias SourceSetDependent < T > = Map < DokkaSourceSet , T >","title":"SourceSetDependent"},{"location":"developer_guide/architecture/data_model/documentable_model/#extraproperty","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. This element is a bit more complex, so you can read more about how to use it in a separate section .","title":"ExtraProperty"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentation-model","text":"The Documentation model is used alongside documentables to store data obtained by parsing code comments (such as KDocs / Javadocs).","title":"Documentation model"},{"location":"developer_guide/architecture/data_model/documentable_model/#doctag","text":"DocTag describes a specific documentation syntax element. It's universal across language sources. For example, the DocTag B is the same for **bold** in Kotlin and bold in Java. However, some DocTag elements are specific to one language. There are many such examples for Java, because it allows HTML tags inside the Javadoc comments, some of which are simply not possible to reproduce with Markdown that KDocs use. DocTag elements can be deeply nested with other DocTag children elements. Examples: data class H1 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class H2 ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strikethrough ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class Strong ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : DocTag () data class CodeBlock ( override val children : List < DocTag > = emptyList (), override val params : Map < String , String > = emptyMap () ) : Code ()","title":"DocTag"},{"location":"developer_guide/architecture/data_model/documentable_model/#tagwrapper","text":"TagWrapper describes the whole comment description or a specific comment tag. For example: @see / @author / @return . Since each such section may contain formatted text inside it, each TagWrapper has DocTag children. /** * @author **Ben Affleck* * @return nothing, except _sometimes_ it may throw an [Error] */ fun foo () {}","title":"TagWrapper"},{"location":"developer_guide/architecture/data_model/documentable_model/#documentationnode","text":"DocumentationNode acts as a container for multiple TagWrapper elements for a specific Documentable , usually used like this: data class DFunction ( ... val documentation : SourceSetDependent < DocumentationNode > , ... )","title":"DocumentationNode"},{"location":"developer_guide/architecture/data_model/extra/","text":"Extra \u00b6 Introduction \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. ExtraProperty classes are available both in the Documentable and the Content models. To create a new extra, you need to implement the ExtraProperty interface. 
It is advised to use the following pattern when declaring new extras: data class CustomExtra ( [ any data relevant to your extra ] , [ any data relevant to your extra ] ): ExtraProperty < Documentable > { override val key : CustomExtra . Key < Documentable , *> = CustomExtra companion object : CustomExtra . Key < Documentable , CustomExtra > } The merge strategy (the mergeStrategyFor method) for extras is invoked during the merging of the documentables from different source sets , when the documentables being merged have their own Extra of the same type. PropertyContainer \u00b6 All extras for ContentNode and Documentable classes are stored in the PropertyContainer class instances. data class DFunction ( ... override val extra : PropertyContainer < DFunction > = PropertyContainer . empty () ... ) : WithExtraProperties < DFunction > PropertyContainer has a number of convenient functions for handling extras in a collection-like manner. The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable . Usage example \u00b6 In the following example, we will create a DFunction -only extra property, store it and then retrieve its value: // Extra that is applicable only to DFunction data class CustomExtra ( val customExtraValue : String ) : ExtraProperty < DFunction > { override val key : ExtraProperty . Key < Documentable , *> = CustomExtra companion object : ExtraProperty . Key < Documentable , CustomExtra > } // Storing it inside the documentable fun DFunction . withCustomExtraProperty ( data : String ): DFunction { return this . copy ( extra = extra + CustomExtra ( data ) ) } // Retrieving it from the documentable fun DFunction . getCustomExtraPropertyValue (): String? { return this . extra [ CustomExtra ]?. customExtraValue } You can also use extras as markers, without storing any data in them: object MarkerExtra : ExtraProperty < Any > , ExtraProperty . Key < Any , MarkerExtra > { override val key : ExtraProperty . Key < Any , *> = this } fun Documentable . markIfFunction (): Documentable { return when ( this ) { is DFunction -> this . copy ( extra = extra + MarkerExtra ) else -> this } } fun WithExtraProperties < Documentable > . isMarked (): Boolean { return this . extra [ MarkerExtra ] != null }","title":"Extra properties"},{"location":"developer_guide/architecture/data_model/extra/#extra","text":"","title":"Extra"},{"location":"developer_guide/architecture/data_model/extra/#introduction","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. ExtraProperty classes are available both in the Documentable and the Content models. To create a new extra, you need to implement the ExtraProperty interface. It is advised to use the following pattern when declaring new extras: data class CustomExtra ( [ any data relevant to your extra ] , [ any data relevant to your extra ] ): ExtraProperty < Documentable > { override val key : CustomExtra . Key < Documentable , *> = CustomExtra companion object : CustomExtra . 
Key < Documentable , CustomExtra > } Merge strategy (the mergeStrategyFor method) for extras is invoked during the merging of the documentables from different source sets , when the documentables being merged have their own Extra of the same type.","title":"Introduction"},{"location":"developer_guide/architecture/data_model/extra/#propertycontainer","text":"All extras for ContentNode and Documentable classes are stored in the PropertyContainer class instances. data class DFunction ( ... override val extra : PropertyContainer < DFunction > = PropertyContainer . empty () ... ) : WithExtraProperties < DFunction > PropertyContainer has a number of convenient functions for handling extras in a collection-like manner. The generic class parameter C limits the types of properties that can be stored in the container - it must match the generic C class parameter from the ExtraProperty interface. This allows creating extra properties which can only be stored in a specific Documentable .","title":"PropertyContainer"},{"location":"developer_guide/architecture/data_model/extra/#usage-example","text":"In following example we will create a DFunction -only extra property, store it and then retrieve its value: // Extra that is applicable only to DFunction data class CustomExtra ( val customExtraValue : String ) : ExtraProperty < DFunction > { override val key : ExtraProperty . Key < Documentable , *> = CustomExtra companion object : ExtraProperty . Key < Documentable , CustomExtra > } // Storing it inside the documentable fun DFunction . withCustomExtraProperty ( data : String ): DFunction { return this . copy ( extra = extra + CustomExtra ( data ) ) } // Retrieveing it from the documentable fun DFunction . getCustomExtraPropertyValue (): String? { return this . extra [ CustomExtra ]?. customExtraValue } You can also use extras as markers, without storing any data in them: object MarkerExtra : ExtraProperty < Any > , ExtraProperty . Key < Any , MarkerExtra > { override val key : ExtraProperty . Key < Any , *> = this } fun Documentable . markIfFunction (): Documentable { return when ( this ) { is DFunction -> this . copy ( extra = extra + MarkerExtra ) else -> this } } fun WithExtraProperties < Documentable > . isMarked (): Boolean { return this . extra [ MarkerExtra ] != null }","title":"Usage example"},{"location":"developer_guide/architecture/data_model/page_content/","text":"Page / Content Model \u00b6 Even though the Page and Content models reside on the same level (under Page ), it is easier to view them as two different models altogether, even though Content is only used in conjunction with and inside the Page model only. Page \u00b6 The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file. The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as ( .html , .md , etc), and how, is up to the Renderer extension. Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage , PackagePage , ClasslikePage , MemberPage and so on. The Page model can be represented as a tree, with RootPageNode at the root. 
Here's an example of how an arbitrary project's Page tree might look, if the project consists of a module with 3 packages, one of which contains a top level function, a top level property and a class, inside which there's a function and a property: flowchart TD RootPageNode --> firstPackage[PackagePageNode] RootPageNode --> secondPackage[PackagePageNode] RootPageNode --> thirdPackage[PackagePageNode] firstPackage --> firstPackageFirstMember[MemberPageNode - Function] firstPackage --> firstPackageSecondMember[MemberPageNode - Property] firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class] firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function] firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property] secondPackage --> etcOne[...] thirdPackage --> etcTwo[...] Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it. Content Model \u00b6 The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal. For an example, have a look at the subclasses of ContentNode : ContentText , ContentList , ContentTable , ContentCodeBlock , ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style. // real example of composing content using the `DocumentableContentBuilder` DSL orderedList { item { text ( \"This list contains a nested table:\" ) table { header { text ( \"Col1\" ) text ( \"Col2\" ) } row { text ( \"Text1\" ) text ( \"Text2\" ) } } } item { group ( styles = setOf ( TextStyle . Bold )) { text ( \"This is bold\" ) text ( \"This is also bold\" ) } } } It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process it, be it visually (html pages) or otherwise (json). For instance, HtmlRenderer might render ContentCodeBlock as <code>text</code>, but CommonmarkRenderer might render it using backticks. DCI \u00b6 Each node is identified by a unique DCI , which stands for Dokka Content Identifier . DCI aggregates DRI s of all documentables that are used by the given ContentNode . data class DCI ( val dri : Set < DRI > , val kind : Kind ) All references to other nodes (other than direct ownership) are described using DCI . ContentKind \u00b6 ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single tab or a block within a class's page. For example, on the same page that describes a class you can have multiple sections (== ContentKind s). One to describe functions, one to describe properties, another one to describe the constructors, and so on. Styles \u00b6 Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way. group ( styles = setOf ( TextStyle . Paragraph )) { text ( \"Text1\" , styles = setOf ( TextStyle . Bold )) text ( \"Text2\" , styles = setOf ( TextStyle . Italic )) } It is the responsibility of the Renderer (i.e. a specific output format) to render it in a way the user can process it. For instance, HtmlRenderer might render TextStyle.Bold as <b>text</b>, but CommonmarkRenderer might render it as **text** . There are a number of existing styles that you can use, most of them are supported by the HtmlRenderer extension out of the box: // for code highlighting enum class TokenStyle : Style { Keyword , Punctuation , Function , Operator , Annotation , Number , String , Boolean , Constant , Builtin , ... } enum class TextStyle : Style { Bold , Italic , Strong , Strikethrough , Paragraph , ... } enum class ContentStyle : Style { TabbedContent , RunnableSample , Wrapped , Indented , ... } Extra \u00b6 ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions. This element is a bit complex, so you can read more about how to use it in a separate section .","title":"Page & Content"},{"location":"developer_guide/architecture/data_model/page_content/#page-content-model","text":"Even though the Page and Content models reside on the same level (under Page ), it is easier to view them as two different models altogether, even though Content is only used in conjunction with and inside the Page model.","title":"Page / Content Model"},{"location":"developer_guide/architecture/data_model/page_content/#page","text":"The Page model represents the structure of documentation pages to be generated. During rendering, each page is processed separately, so one page corresponds to exactly one output file. The Page model is independent of the final output format. In other words, it's universal. Which file extension the pages should be created as ( .html , .md , etc), and how, is up to the Renderer extension. Subclasses of the PageNode class represent the different kinds of pages, such as ModulePage , PackagePage , ClasslikePage , MemberPage and so on. The Page model can be represented as a tree, with RootPageNode at the root. 
Here's an example of how an arbitrary project's Page tree might look like, if the project consists of a module with 3 packages, one of which contains a top level function, a top level property and a class, inside which there's a function and a property: flowchart TD RootPageNode --> firstPackage[PackagePageNode] RootPageNode --> secondPackage[PackagePageNode] RootPageNode --> thirdPackage[PackagePageNode] firstPackage --> firstPackageFirstMember[MemberPageNode - Function] firstPackage --> firstPackageSecondMember[MemberPageNode - Property] firstPackage ---> firstPackageClasslike[ClasslikePageNode - Class] firstPackageClasslike --> firstPackageClasslikeFirstMember[MemberPageNode - Function] firstPackageClasslike --> firstPackageClasslikeSecondMember[MemberPageNode - Property] secondPackage --> etcOne[...] thirdPackage --> etcTwo[...] Almost all pages are derivatives of ContentPage - it's the type of a page that has user-visible content on it.","title":"Page"},{"location":"developer_guide/architecture/data_model/page_content/#content-model","text":"The Content model describes what the pages consist of. It is essentially a set of building blocks that you can put together to represent some content. It is also output-format independent and universal. For an example, have a look at the subclasses of ContentNode : ContentText , ContentList , ContentTable , ContentCodeBlock , ContentHeader and so on -- all self-explanatory. You can group chunks of content together with ContentGroup - for example, to wrap all children with a style. // real example of composing content using the `DocumentableContentBuilder` DSL orderedList { item { text ( \"This list contains a nested table:\" ) table { header { text ( \"Col1\" ) text ( \"Col2\" ) } row { text ( \"Text1\" ) text ( \"Text2\" ) } } } item { group ( styles = setOf ( TextStyle . Bold )) { text ( \"This is bald\" ) text ( \"This is also bald\" ) } } } It is the responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it, be it visually (html pages) or otherwise (json). For instance, HtmlRenderer might render ContentCodeBlock as text
, but CommonmarkRenderer might render it using backticks.","title":"Content Model"},{"location":"developer_guide/architecture/data_model/page_content/#dci","text":"Each node is identified by a unique DCI , which stands for Dokka Content Identifier . DCI aggregates DRI s of all documentables that are used by the given ContentNode . data class DCI ( val dri : Set < DRI > , val kind : Kind ) All references to other nodes (other than direct ownership) are described using DCI .","title":"DCI"},{"location":"developer_guide/architecture/data_model/page_content/#contentkind","text":"ContentKind represents a grouping of content of one kind that can be rendered as part of a composite page, like a single one tab or a block within a class's page. For example, on the same page that describes a class you can have multiple sections (== ContentKind s). One to describe functions, one to describe properties, another one to describe the constructors, and so on.","title":"ContentKind"},{"location":"developer_guide/architecture/data_model/page_content/#styles","text":"Each ContentNode has a styles property in case you want to indicate to the Renderer that this content needs to be rendered in a certain way. group ( styles = setOf ( TextStyle . Paragraph )) { text ( \"Text1\" , styles = setOf ( TextStyle . Bold )) text ( \"Text2\" , styles = setOf ( TextStyle . Italic )) } It is responsibility of the Renderer (i.e a specific output format) to render it in a way the user can process it. For instance, HtmlRenderer might render TextStyle.Bold as text , but CommonmarkRenderer might render it as **text** . There's a number of existing styles that you can use, most of them are supported by the HtmlRenderer extension out of the box: // for code highlighting enum class TokenStyle : Style { Keyword , Punctuation , Function , Operator , Annotation , Number , String , Boolean , Constant , Builtin , ... } enum class TextStyle : Style { Bold , Italic , Strong , Strikethrough , Paragraph , ... } enum class ContentStyle : Style { TabbedContent , RunnableSample , Wrapped , Indented , ... }","title":"Styles"},{"location":"developer_guide/architecture/data_model/page_content/#extra","text":"ExtraProperty is used to store any additional information that falls outside of the regular model. It is highly recommended to use extras to provide any additional information when creating custom Dokka plugins. All ExtraProperty elements from the Documentable model are propagated into the Content model, and are available in the Renderer extensions. This element is a bit complex, so you can read more about how to use it in a separate section .","title":"Extra"},{"location":"developer_guide/architecture/extension_points/base_plugin/","text":"Base plugin \u00b6 DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions , as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats. If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase , as it reduces the scope of changes you need to make. DokkaBase is used extensively in Dokka's own output formats. You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well. 
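To illustrate how little code such a plugin needs, here is a hedged sketch of registering an extension against DokkaBase. It reuses the plugin DSL shown in the architecture overview; the extension point used below ( preMergeDocumentableTransformer , described in the next section) and the MyTransformer class are assumptions for the example, not Dokka's own sources:
// A minimal sketch, not a definitive implementation: a plugin that plugs a custom
// transformer into DokkaBase's pre-merge documentable transformation stage.
class MyBasePlugin : DokkaPlugin() {
    private val dokkaBase by lazy { plugin<DokkaBase>() }

    val myTransformer by extending {
        dokkaBase.preMergeDocumentableTransformer providing ::MyTransformer
    }
}

class MyTransformer(private val context: DokkaContext) : PreMergeDocumentableTransformer {
    // No-op placeholder: returns the modules unchanged.
    override fun invoke(modules: List<DModule>): List<DModule> = modules
}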
Extension points \u00b6 Some notable extension points defined in Dokka's Base plugin. PreMergeDocumentableTransformer \u00b6 PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation . This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged. It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal , you most likely need an implementation of PreMergeDocumentableTransformer . For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.","title":"Base extensions"},{"location":"developer_guide/architecture/extension_points/base_plugin/#base-plugin","text":"DokkaBase represents Dokka's Base plugin, which provides a number of sensible default implementations for CoreExtensions , as well as declares its own, more high-level abstractions and extension points to be used from other plugins and output formats. If you want to develop a simple plugin that only changes a few details, it is very convenient to rely on default implementations and use extension points defined in DokkaBase , as it reduces the scope of changes you need to make. DokkaBase is used extensively in Dokka's own output formats. You can learn how to add, use, override and configure extensions and extension points in Introduction to Extensions - all of that information is applicable to the DokkaBase plugin as well.","title":"Base plugin"},{"location":"developer_guide/architecture/extension_points/base_plugin/#extension-points","text":"Some notable extension points defined in Dokka's Base plugin.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/base_plugin/#premergedocumentabletransformer","text":"PreMergeDocumentableTransformer is very similar to the DocumentableTransformer core extension point, but it is used during an earlier stage by the Single module generation . This extension point allows you to apply any transformations to the Documentables model before the project's source sets are merged. It is useful if you want to filter/map existing documentables. For example, if you want to exclude members annotated with @Internal , you most likely need an implementation of PreMergeDocumentableTransformer . For simple condition-based filtering of documentables, consider extending SuppressedByConditionDocumentableFilterTransformer - it implements PreMergeDocumentableTransformer and only requires one function to be overridden, whereas the rest is taken care of.","title":"PreMergeDocumentableTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/","text":"Core extension points \u00b6 Core extension points represent the main stages of generating documentation. These extension points are plugin and output format independent, meaning it's the very core functionality and as low-level as can get in Dokka. For higher-level extension functions that can be used in different output formats, have a look at the Base plugin . 
You can find all core extensions in the CoreExtensions class: object CoreExtensions { val preGenerationCheck by coreExtensionPoint < PreGenerationChecker > () val generation by coreExtensionPoint < Generation > () val sourceToDocumentableTranslator by coreExtensionPoint < SourceToDocumentableTranslator > () val documentableMerger by coreExtensionPoint < DocumentableMerger > () val documentableTransformer by coreExtensionPoint < DocumentableTransformer > () val documentableToPageTranslator by coreExtensionPoint < DocumentableToPageTranslator > () val pageTransformer by coreExtensionPoint < PageTransformer > () val renderer by coreExtensionPoint < Renderer > () val postActions by coreExtensionPoint < PostAction > () } On this page, we'll go over each extension point individually. PreGenerationChecker \u00b6 PreGenerationChecker can be used to run some checks and constraints. For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets , and fails if it finds any. Generation \u00b6 Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable. See Generation implementations to learn about the default implementations. SourceToDocumentableTranslator \u00b6 SourceToDocumentableTranslator translates any given sources into the Documentable model. Kotlin and Java sources are supported by default by the Base plugin , but you can analyze any language as long as you can map it to the Documentable model. For reference, see DefaultDescriptorToDocumentableTranslator for Kotlin sources translation DefaultPsiToDocumentableTranslator for Java sources translation DocumentableMerger \u00b6 DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered. DocumentableTransformer \u00b6 DocumentableTransformer performs the same function as PreMergeDocumentableTransformer , but after merging source sets. A notable example is InheritorsExtractorTransformer , which extracts inheritance information from source sets and creates an inheritance map. DocumentableToPageTranslator \u00b6 DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples. Output formats can either use the same page structure or define their own. Only a single extension of this type is expected to be registered. PageTransformer \u00b6 PageTransformer is useful if you need to add, remove or modify generated pages or their content. Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages. If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one. Renderer \u00b6 Renderer - defines the rules on how to render pages and their content: which files to create and how to display the content properly. Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer . PostAction \u00b6 PostAction can be used when you want to run some actions after the documentation has been generated - for example, if you want to move some files around or log some informational messages. 
Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.","title":"Core extension points"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#core-extension-points","text":"Core extension points represent the main stages of generating documentation. These extension points are plugin and output format independent, meaning it's the very core functionality and as low-level as can get in Dokka. For higher-level extension functions that can be used in different output formats, have a look at the Base plugin . You can find all core extensions in the CoreExtensions class: object CoreExtensions { val preGenerationCheck by coreExtensionPoint < PreGenerationChecker > () val generation by coreExtensionPoint < Generation > () val sourceToDocumentableTranslator by coreExtensionPoint < SourceToDocumentableTranslator > () val documentableMerger by coreExtensionPoint < DocumentableMerger > () val documentableTransformer by coreExtensionPoint < DocumentableTransformer > () val documentableToPageTranslator by coreExtensionPoint < DocumentableToPageTranslator > () val pageTransformer by coreExtensionPoint < PageTransformer > () val renderer by coreExtensionPoint < Renderer > () val postActions by coreExtensionPoint < PostAction > () } On this page, we'll go over each extension point individually.","title":"Core extension points"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pregenerationchecker","text":"PreGenerationChecker can be used to run some checks and constraints. For example, Dokka's Javadoc plugin does not support generating documentation for multi-platform projects, so it uses PreGenerationChecker to check for multi-platform source sets , and fails if it finds any.","title":"PreGenerationChecker"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#generation","text":"Generation is responsible for generating documentation as a whole, utilizing higher-level extensions and extension points where applicable. See Generation implementations to learn about the default implementations.","title":"Generation"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#sourcetodocumentabletranslator","text":"SourceToDocumentableTranslator translates any given sources into the Documentable model. Kotlin and Java sources are supported by default by the Base plugin , but you can analyze any language as long as you can map it to the Documentable model. For reference, see DefaultDescriptorToDocumentableTranslator for Kotlin sources translation DefaultPsiToDocumentableTranslator for Java sources translation","title":"SourceToDocumentableTranslator"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentablemerger","text":"DocumentableMerger merges all DModule instances into one. Only one extension of this type is expected to be registered.","title":"DocumentableMerger"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletransformer","text":"DocumentableTransformer performs the same function as PreMergeDocumentableTransformer , but after merging source sets. 
A notable example is InheritorsExtractorTransformer , which extracts inheritance information from source sets and creates an inheritance map.","title":"DocumentableTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#documentabletopagetranslator","text":"DocumentableToPageTranslator is responsible for creating pages and their content. See Page / Content model page for more information and examples. Output formats can either use the same page structure or define their own. Only a single extension of this type is expected to be registered.","title":"DocumentableToPageTranslator"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#pagetransformer","text":"PageTransformer is useful if you need to add, remove or modify generated pages or their content. Using this extension point, plugins like org.jetbrains.dokka:mathjax-plugin can add .js scripts to the HTML pages. If you want all overloaded functions to be rendered on the same page instead of separate ones, you can use PageTransformer to combine the pages into a single one.","title":"PageTransformer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#renderer","text":"Renderer - defines the rules on how to render pages and their content: which files to create and how to display the content properly. Custom output format plugins should use the Renderer extension point. Notable examples are HtmlRenderer and CommonmarkRenderer .","title":"Renderer"},{"location":"developer_guide/architecture/extension_points/core_extension_points/#postaction","text":"PostAction can be used when you want to run some actions after the documentation has been generated - for example, if you want to move some files around or log some informational messages. Dokka's Versioning plugin utilizes PostAction to move generated documentation to the versioned directories.","title":"PostAction"},{"location":"developer_guide/architecture/extension_points/extension_points/","text":"Extension points \u00b6 In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation. Declaring extension points \u00b6 If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code. class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () } interface SampleExtensionPointInterface { fun doSomething ( input : Input ): List < Output > } class Input class Output Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples. Extending from extension points \u00b6 You can use extension points to provide your own implementations in order to customize a plugin's behaviour. If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase ), you can use plugin querying API to do that. The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface . class MyExtendedPlugin : DokkaPlugin () { val mySampleExtensionImplementation by extending { plugin < MyPlugin > (). 
sampleExtensionPoint with SampleExtensionImpl () } } class SampleExtensionImpl : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself: open class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () val defaultSampleExtension by extending { sampleExtensionPoint with DefaultSampleExtension () } } class DefaultSampleExtension : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Providing \u00b6 If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead. val defaultSampleExtension by extending { sampleExtensionPoint providing { context -> // can use context to query other extensions or get configuration DefaultSampleExtension () } } You can read more on what you can do with context in Obtaining extension instance . Override \u00b6 By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other. However, a plugin can expect only a single extension to be registered for an extension point. In this case, you can use the override keyword to override the existing registered extension: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { ( myPlugin . sampleExtensionPoint with SampleExtensionImpl () override myPlugin . defaultSampleExtension ) } } This is also useful if you wish to override some extension from DokkaBase , to disable or alter it. Order \u00b6 Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () order { before ( myPlugin . firstExtension ) after ( myPlugin . thirdExtension ) } } } Conditional apply \u00b6 If you want your extension to be registered only if some condition is true , you can use the applyIf construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () applyIf { Random . Default . nextBoolean () } } } Obtaining extension instance \u00b6 After an extension point has been created and some extensions have been registered , you can use query and querySingle functions to find all or just a single implementation. class MyExtension ( context : DokkaContext ) { // returns all registered extensions for the extension point val allSampleExtensions = context . plugin < MyPlugin > (). query { sampleExtensionPoint } // will throw an exception if more than one extension is found. // use if you expect only a single extension to be registered for the extension point val singleSampleExtensions = context . plugin < MyPlugin > (). querySingle { sampleExtensionPoint } fun invoke () { allSampleExtensions . forEach { it . doSomething ( Input ()) } singleSampleExtensions . 
doSomething ( Input ()) } } In order to have access to DokkaContext , you can use the providing keyword when registering an extension.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#extension-points","text":"In this section you can learn how to create new extension points, how to configure existing ones, and how to query for registered extensions when generating documentation.","title":"Extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#declaring-extension-points","text":"If you are writing a plugin, you can create your own extension points that other developers (or you) can use in other plugins / parts of code. class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () } interface SampleExtensionPointInterface { fun doSomething ( input : Input ): List < Output > } class Input class Output Usually, you would want to provide some default implementations for your extension points. You can do that within the same plugin class by extending an extension point you've just created. See Extending from extension points for examples.","title":"Declaring extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#extending-from-extension-points","text":"You can use extension points to provide your own implementations in order to customize a plugin's behaviour. If you want to provide an implementation for an extension point declared in an external plugin (including DokkaBase ), you can use plugin querying API to do that. The example below shows how to extend MyPlugin (that was created above) with an implementation of SampleExtensionPointInterface . class MyExtendedPlugin : DokkaPlugin () { val mySampleExtensionImplementation by extending { plugin < MyPlugin > (). sampleExtensionPoint with SampleExtensionImpl () } } class SampleExtensionImpl : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () } Alternatively, if it is your own plugin, you can do that within the same class as the extension point itself: open class MyPlugin : DokkaPlugin () { val sampleExtensionPoint by extensionPoint < SampleExtensionPointInterface > () val defaultSampleExtension by extending { sampleExtensionPoint with DefaultSampleExtension () } } class DefaultSampleExtension : SampleExtensionPointInterface { override fun doSomething ( input : Input ): List < Output > = listOf () }","title":"Extending from extension points"},{"location":"developer_guide/architecture/extension_points/extension_points/#providing","text":"If you need to have access to DokkaContext when creating an extension, you can use the providing keyword instead. val defaultSampleExtension by extending { sampleExtensionPoint providing { context -> // can use context to query other extensions or get configuration DefaultSampleExtension () } } You can read more on what you can do with context in Obtaining extension instance .","title":"Providing"},{"location":"developer_guide/architecture/extension_points/extension_points/#override","text":"By extending an extension point, you are registering an additional extension. This behaviour is expected by some extension points, for example the Documentable transformers, because all registered transformer extensions do their own transformations independently and one after the other. However, a plugin can expect only a single extension to be registered for an extension point. 
In this case, you can use the override keyword to override the existing registered extension: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { ( myPlugin . sampleExtensionPoint with SampleExtensionImpl () override myPlugin . defaultSampleExtension ) } } This is also useful if you wish to override some extension from DokkaBase , to disable or alter it.","title":"Override"},{"location":"developer_guide/architecture/extension_points/extension_points/#order","text":"Sometimes, the order in which extensions are invoked matters. This is something you can control as well using the order construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () order { before ( myPlugin . firstExtension ) after ( myPlugin . thirdExtension ) } } }","title":"Order"},{"location":"developer_guide/architecture/extension_points/extension_points/#conditional-apply","text":"If you want your extension to be registered only if some condition is true , you can use the applyIf construct: class MyExtendedPlugin : DokkaPlugin () { private val myPlugin by lazy { plugin < MyPlugin > () } val mySampleExtensionImplementation by extending { myPlugin . sampleExtensionPoint with SampleExtensionImpl () applyIf { Random . Default . nextBoolean () } } }","title":"Conditional apply"},{"location":"developer_guide/architecture/extension_points/extension_points/#obtaining-extension-instance","text":"After an extension point has been created and some extensions have been registered , you can use query and querySingle functions to find all or just a single implementation. class MyExtension ( context : DokkaContext ) { // returns all registered extensions for the extension point val allSampleExtensions = context . plugin < MyPlugin > (). query { sampleExtensionPoint } // will throw an exception if more than one extension is found. // use if you expect only a single extension to be registered for the extension point val singleSampleExtensions = context . plugin < MyPlugin > (). querySingle { sampleExtensionPoint } fun invoke () { allSampleExtensions . forEach { it . doSomething ( Input ()) } singleSampleExtensions . doSomething ( Input ()) } } In order to have access to DokkaContext , you can use the providing keyword when registering an extension.","title":"Obtaining extension instance"},{"location":"developer_guide/architecture/extension_points/generation_implementations/","text":"Generation implementations \u00b6 There are two main implementations of the Generation core extension point: SingleModuleGeneration - generates documentation for a single module, for instance when dokkaHtml task is invoked AllModulesPageGeneration - generates multi-module documentation, for instance when dokkaHtmlMultiModule task is invoked. SingleModuleGeneration \u00b6 SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish. Below you can see the flow of how Dokka's data model is transformed by various core and base extensions. 
flowchart TD Input -- SourceToDocumentableTranslator --> doc1[Documentables] subgraph documentables [ ] doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables] doc2 -- DocumentableMerger --> doc3[Documentables] doc3 -- DocumentableTransformer --> doc4[Documentables] end doc4 -- DocumentableToPageTranslator --> page1[Pages] subgraph ide2 [ ] page1 -- PageTransformer --> page2[Pages] end page2 -- Renderer --> Output You can read about what each stage does in Core extension points and Base plugin . AllModulesPageGeneration \u00b6 AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration . Under the hood, it just collects all of the pages generated for individual modules, and assembles it all together, creating navigation links between the modules and so on.","title":"Generation implementations"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#generation-implementations","text":"There are two main implementations of the Generation core extension point: SingleModuleGeneration - generates documentation for a single module, for instance when dokkaHtml task is invoked AllModulesPageGeneration - generates multi-module documentation, for instance when dokkaHtmlMultiModule task is invoked.","title":"Generation implementations"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#singlemodulegeneration","text":"SingleModuleGeneration is at the heart of generating documentation. It utilizes core and base extensions to build the documentation from start to finish. Below you can see the flow of how Dokka's data model is transformed by various core and base extensions. flowchart TD Input -- SourceToDocumentableTranslator --> doc1[Documentables] subgraph documentables [ ] doc1 -- PreMergeDocumentableTransformer --> doc2[Documentables] doc2 -- DocumentableMerger --> doc3[Documentables] doc3 -- DocumentableTransformer --> doc4[Documentables] end doc4 -- DocumentableToPageTranslator --> page1[Pages] subgraph ide2 [ ] page1 -- PageTransformer --> page2[Pages] end page2 -- Renderer --> Output You can read about what each stage does in Core extension points and Base plugin .","title":"SingleModuleGeneration"},{"location":"developer_guide/architecture/extension_points/generation_implementations/#allmodulespagegeneration","text":"AllModulesPageGeneration utilizes the output generated by SingleModuleGeneration . Under the hood, it just collects all of the pages generated for individual modules, and assembles it all together, creating navigation links between the modules and so on.","title":"AllModulesPageGeneration"},{"location":"developer_guide/community/slack/","text":"Slack channel \u00b6 Dokka has a dedicated #dokka channel in the Kotlin Community Slack , where you can ask questions and chat about using, customizing or contributing to Dokka. Follow the instructions to get an invite or connect directly .","title":"Slack"},{"location":"developer_guide/community/slack/#slack-channel","text":"Dokka has a dedicated #dokka channel in the Kotlin Community Slack , where you can ask questions and chat about using, customizing or contributing to Dokka. 
Follow the instructions to get an invite or connect directly .","title":"Slack channel"},{"location":"developer_guide/plugin-development/introduction/","text":"Introduction to plugin development \u00b6 Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box. Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more. In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions . Setup \u00b6 Template \u00b6 The easiest way to start is to use the convenient Dokka plugin template . It has pre-configured dependencies, publishing and signing of your artifacts. Manual \u00b6 At a bare minimum, a Dokka plugin requires dokka-core as a dependency: import org.jetbrains.kotlin.gradle.dsl.JvmTarget import org.jetbrains.kotlin.gradle.tasks.KotlinCompile plugins { kotlin ( \"jvm\" ) version \"\" } dependencies { compileOnly ( \"org.jetbrains.dokka:dokka-core:\" ) } tasks . withType < KotlinCompile > (). configureEach { compilerOptions . jvmTarget . set ( JvmTarget . JVM_1_8 ) } In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services . All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader . Extension points \u00b6 Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions . You can learn how to declare extension points and extensions in Introduction to Extension points . In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka. Example \u00b6 You can follow the sample plugin tutorial , which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation. For more practical examples, have a look at sources of community plugins . Help \u00b6 If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub .","title":"Plugin development"},{"location":"developer_guide/plugin-development/introduction/#introduction-to-plugin-development","text":"Dokka was built from the ground up to be easily extensible and highly customizable, which allows the community to implement plugins for missing or very specific features that are not provided out of the box. Dokka plugins range anywhere from supporting other programming language sources to exotic output formats. You can add support for your own KDoc tags or annotations, teach Dokka how to render different DSLs that are found in KDoc descriptions, visually redesign Dokka's pages to be seamlessly integrated into your company's website, integrate it with other tools and so much more. 
In order to have an easier time developing plugins, it's a good idea to go through Dokka's internals first, to learn more about its data model and extensions .","title":"Introduction to plugin development"},{"location":"developer_guide/plugin-development/introduction/#setup","text":"","title":"Setup"},{"location":"developer_guide/plugin-development/introduction/#template","text":"The easiest way to start is to use the convenient Dokka plugin template . It has pre-configured dependencies, publishing and signing of your artifacts.","title":"Template"},{"location":"developer_guide/plugin-development/introduction/#manual","text":"At a bare minimum, a Dokka plugin requires dokka-core as a dependency: import org.jetbrains.kotlin.gradle.dsl.JvmTarget import org.jetbrains.kotlin.gradle.tasks.KotlinCompile plugins { kotlin ( \"jvm\" ) version \"\" } dependencies { compileOnly ( \"org.jetbrains.dokka:dokka-core:\" ) } tasks . withType < KotlinCompile > (). configureEach { compilerOptions . jvmTarget . set ( JvmTarget . JVM_1_8 ) } In order to load a plugin into Dokka, your class must extend the DokkaPlugin class. A fully qualified name of that class must be placed in a file named org.jetbrains.dokka.plugability.DokkaPlugin under resources/META-INF/services . All instances are automatically loaded during Dokka's configuration step using java.util.ServiceLoader .","title":"Manual"},{"location":"developer_guide/plugin-development/introduction/#extension-points","text":"Dokka provides a set of entry points for which you can create your own implementations. If you are not sure which extension points to use, have a look at core extensions and base extensions . You can learn how to declare extension points and extensions in Introduction to Extension points . In case no suitable extension point exists for your use case, do share the use case with the maintainers \u2014 it might be added in a future version of Dokka.","title":"Extension points"},{"location":"developer_guide/plugin-development/introduction/#example","text":"You can follow the sample plugin tutorial , which covers the creation of a simple plugin that hides members annotated with your own @Internal annotation: that is, it excludes these members from the generated documentation. For more practical examples, have a look at sources of community plugins .","title":"Example"},{"location":"developer_guide/plugin-development/introduction/#help","text":"If you have any further questions, feel free to get in touch with Dokka's maintainers via Slack or GitHub .","title":"Help"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/","text":"Sample plugin tutorial \u00b6 We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden. The plugin will be tested with the following code: package org.jetbrains.dokka.internal.test annotation class Internal fun shouldBeVisible () {} @Internal fun shouldBeExcludedFromDocumentation () {} Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation. Full source code of this tutorial can be found in Dokka's examples under hide-internal-api . Preparing the project \u00b6 We'll begin by using Dokka plugin template . Press the Use this template button and open this project in IntelliJ IDEA . First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own. 
For instance, package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin : package org.example.dokka.plugin import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { } After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin : org . example . dokka . plugin . HideInternalApiPlugin At this point you can also change project name in settings.gradle.kts (to hide-internal-api in our case) and groupId in build.gradle.kts . Extending Dokka \u00b6 After preparing the project we can begin extending Dokka with our own extension. Having read through Core extensions , it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables. Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it. Create a new class, place it next to your plugin and implement the abstract method. You should end up with this: package org.example.dokka.plugin import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () {} class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { return false } } Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in debugger and having a look at what fields and values a given Documentable has. To do that, we'll need to register our extension point first, then we can publish our plugin and set the breakpoint. Having read through Introduction to extensions , we now know how to register our extensions: class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } At this point we're ready to debug our plugin locally, it should already work, but do nothing. Debugging \u00b6 Please read through Debugging Dokka , it goes over the same steps in more detail and with examples. Below you will find rough instructions. First, let's begin by publishing our plugin to mavenLocal() . ./gradlew publishToMavenLocal This will publish your plugin under the groupId , artifactId and version that you've specified in your build.gradle.kts . In our case it's org.example:hide-internal-api:1.0-SNAPSHOT . Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies: dependencies { dokkaPlugin ( \"org.example:hide-internal-api:1.0-SNAPSHOT\" ) } Next, in that project let's run dokkaHtml with debug enabled: ./gradlew clean dokkaHtml -Dorg.gradle.debug = true --no-daemon Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and run jvm remote debug. If you've done everything correctly, it should stop in debugger and you should be able to observe the values contained inside d: Documentable . 
Implementing plugin logic \u00b6 Now that we've stopped at our breakpoint, let's skip until we see shouldBeExcludedFromDocumentation function in the place of d: Documentable (observe the changing name property). Looking at what's inside the object, you might notice it has 3 values in extra , one of which is Annotations . Sounds like something we need! Having poked around, we come up with the following monstrosity of code for determining if a given documentable has the @Internal annotation (it can, of course, be refactored later): override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } Seems like we're done with writing our plugin and can begin testing it manually. Manual testing \u00b6 At this point, the implementation of your plugin should look roughly like this: package org.example.dokka.plugin import org.jetbrains.dokka.base.DokkaBase import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Annotations import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.model.properties.WithExtraProperties import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } } Bump the plugin version in build.gradle.kts , publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time). It should work, you should not be able to see shouldBeExcludedFromDocumentation function in generated documentation. Manual testing is cool and all, but wouldn't it be better if we could somehow write unit tests for it? Indeed! Unit testing \u00b6 You might've noticed that plugin template comes with a pre-made test class. Feel free to move it to another package and rename it. We are mostly interested in a single test case - functions annotated with @Internal should be hidden, while all other public functions should be visible. Plugin API comes with a set of convenient test utilities that are used to test Dokka itself, so it covers a wide range of use cases. When in doubt, see Dokka's tests for reference. Below you will find a complete unit test that passes, and the main takeaways below that. 
package org.example.dokka.plugin import org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest import kotlin.test.Test import kotlin.test.assertEquals class HideInternalApiPluginTest : BaseAbstractTest () { @Test fun `should hide annotated functions` () { val configuration = dokkaConfiguration { sourceSets { sourceSet { sourceRoots = listOf ( \"src/main/kotlin/basic/Test.kt\" ) } } } val hideInternalPlugin = HideInternalApiPlugin () testInline ( \"\"\" |/src/main/kotlin/basic/Test.kt |package org.jetbrains.dokka.internal.test | |annotation class Internal | |fun shouldBeVisible() {} | |@Internal |fun shouldBeExcludedFromDocumentation() {} \"\"\" . trimMargin (), configuration = configuration , pluginOverrides = listOf ( hideInternalPlugin ) ) { preMergeDocumentablesTransformationStage = { modules -> val testModule = modules . single { it . name == \"root\" } val testPackage = testModule . packages . single { it . name == \"org.jetbrains.dokka.internal.test\" } val packageFunctions = testPackage . functions assertEquals ( 1 , packageFunctions . size ) assertEquals ( \"shouldBeVisible\" , packageFunctions [ 0 ] . name ) } } } } Note that the package of the tested code (inside testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change that to your own if you are following along, otherwise it will fail. Things to note and remember: Your test class should extend BaseAbstractTest , which contains base utility methods for testing. You can configure Dokka to your liking, enable some specific settings, configure source sets , etc. All done via dokkaConfiguration DSL. testInline function is the main entry point for unit tests You can pass plugins to be used in a test, notice pluginOverrides parameter You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during PreMergeDocumentableTransformer stage, we can test it on the same level (that is preMergeDocumentablesTransformationStage ). You will need to write asserts using the model of whatever stage you choose. For Documentable transformation stage it's Documentable , for Page generation stage you would have Page model, and for Output you can have .html files that you will need to parse with JSoup (there are also utilities for that). Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Sample plugin tutorial"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#sample-plugin-tutorial","text":"We'll go over creating a simple plugin that covers a very common use case: generate documentation for everything except for members annotated with a custom @Internal annotation - they should be hidden. The plugin will be tested with the following code: package org.jetbrains.dokka.internal.test annotation class Internal fun shouldBeVisible () {} @Internal fun shouldBeExcludedFromDocumentation () {} Expected behavior: function shouldBeExcludedFromDocumentation should not be visible in generated documentation. Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Sample plugin tutorial"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#preparing-the-project","text":"We'll begin by using Dokka plugin template . Press the Use this template button and open this project in IntelliJ IDEA . 
First, let's rename the pre-made template package and MyAwesomeDokkaPlugin class to something of our own. For instance, package can be renamed to org.example.dokka.plugin and the class to HideInternalApiPlugin : package org.example.dokka.plugin import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { } After you do that, make sure to update the path to this class in resources/META-INF/services/org.jetbrains.dokka.plugability.DokkaPlugin : org . example . dokka . plugin . HideInternalApiPlugin At this point you can also change project name in settings.gradle.kts (to hide-internal-api in our case) and groupId in build.gradle.kts .","title":"Preparing the project"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#extending-dokka","text":"After preparing the project we can begin extending Dokka with our own extension. Having read through Core extensions , it's clear that we need a PreMergeDocumentableTransformer extension in order to filter out undesired documentables. Moreover, the article mentioned a convenient abstract transformer SuppressedByConditionDocumentableFilterTransformer which is perfect for our use case, so we can try to implement it. Create a new class, place it next to your plugin and implement the abstract method. You should end up with this: package org.example.dokka.plugin import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () {} class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { return false } } Now we somehow need to find all annotations applied to d: Documentable and see if our @Internal annotation is present. However, it's not very clear how to do that. What usually helps is stopping in debugger and having a look at what fields and values a given Documentable has. To do that, we'll need to register our extension point first, then we can publish our plugin and set the breakpoint. Having read through Introduction to extensions , we now know how to register our extensions: class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } At this point we're ready to debug our plugin locally, it should already work, but do nothing.","title":"Extending Dokka"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#debugging","text":"Please read through Debugging Dokka , it goes over the same steps in more detail and with examples. Below you will find rough instructions. First, let's begin by publishing our plugin to mavenLocal() . ./gradlew publishToMavenLocal This will publish your plugin under the groupId , artifactId and version that you've specified in your build.gradle.kts . In our case it's org.example:hide-internal-api:1.0-SNAPSHOT . 
Open a debug project of your choosing that has Dokka configured, and add our plugin to dependencies: dependencies { dokkaPlugin ( \"org.example:hide-internal-api:1.0-SNAPSHOT\" ) } Next, in that project let's run dokkaHtml with debug enabled: ./gradlew clean dokkaHtml -Dorg.gradle.debug = true --no-daemon Switch to the plugin project, set a breakpoint inside shouldBeSuppressed and run jvm remote debug. If you've done everything correctly, it should stop in debugger and you should be able to observe the values contained inside d: Documentable .","title":"Debugging"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#implementing-plugin-logic","text":"Now that we've stopped at our breakpoint, let's skip until we see shouldBeExcludedFromDocumentation function in the place of d: Documentable (observe the changing name property). Looking at what's inside the object, you might notice it has 3 values in extra , one of which is Annotations . Sounds like something we need! Having poked around, we come up with the following monstrosity of code for determining if a given documentable has the @Internal annotation (it can, of course, be refactored later): override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } Seems like we're done with writing our plugin and can begin testing it manually.","title":"Implementing plugin logic"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#manual-testing","text":"At this point, the implementation of your plugin should look roughly like this: package org.example.dokka.plugin import org.jetbrains.dokka.base.DokkaBase import org.jetbrains.dokka.base.transformers.documentables.SuppressedByConditionDocumentableFilterTransformer import org.jetbrains.dokka.model.Annotations import org.jetbrains.dokka.model.Documentable import org.jetbrains.dokka.model.properties.WithExtraProperties import org.jetbrains.dokka.plugability.DokkaContext import org.jetbrains.dokka.plugability.DokkaPlugin class HideInternalApiPlugin : DokkaPlugin () { val myFilterExtension by extending { plugin < DokkaBase > (). preMergeDocumentableTransformer providing :: HideInternalApiTransformer } } class HideInternalApiTransformer ( context : DokkaContext ) : SuppressedByConditionDocumentableFilterTransformer ( context ) { override fun shouldBeSuppressed ( d : Documentable ): Boolean { val annotations : List < Annotations . Annotation > = ( d as? WithExtraProperties <*> ) ?. extra ?. allOfType < Annotations > () ?. flatMap { it . directAnnotations . values . flatten () } ?: emptyList () return annotations . any { isInternalAnnotation ( it ) } } private fun isInternalAnnotation ( annotation : Annotations . Annotation ): Boolean { return annotation . dri . packageName == \"org.jetbrains.dokka.internal.test\" && annotation . dri . classNames == \"Internal\" } } Bump the plugin version in build.gradle.kts , publish it to Maven Local, open the debug project and run dokkaHtml (without debug this time). 
It should work, you should not be able to see shouldBeExcludedFromDocumentation function in generated documentation. Manual testing is cool and all, but wouldn't it be better if we could somehow write unit tests for it? Indeed!","title":"Manual testing"},{"location":"developer_guide/plugin-development/sample-plugin-tutorial/#unit-testing","text":"You might've noticed that plugin template comes with a pre-made test class. Feel free to move it to another package and rename it. We are mostly interested in a single test case - functions annotated with @Internal should be hidden, while all other public functions should be visible. Plugin API comes with a set of convenient test utilities that are used to test Dokka itself, so it covers a wide range of use cases. When in doubt, see Dokka's tests for reference. Below you will find a complete unit test that passes, and the main takeaways below that. package org.example.dokka.plugin import org.jetbrains.dokka.base.testApi.testRunner.BaseAbstractTest import kotlin.test.Test import kotlin.test.assertEquals class HideInternalApiPluginTest : BaseAbstractTest () { @Test fun `should hide annotated functions` () { val configuration = dokkaConfiguration { sourceSets { sourceSet { sourceRoots = listOf ( \"src/main/kotlin/basic/Test.kt\" ) } } } val hideInternalPlugin = HideInternalApiPlugin () testInline ( \"\"\" |/src/main/kotlin/basic/Test.kt |package org.jetbrains.dokka.internal.test | |annotation class Internal | |fun shouldBeVisible() {} | |@Internal |fun shouldBeExcludedFromDocumentation() {} \"\"\" . trimMargin (), configuration = configuration , pluginOverrides = listOf ( hideInternalPlugin ) ) { preMergeDocumentablesTransformationStage = { modules -> val testModule = modules . single { it . name == \"root\" } val testPackage = testModule . packages . single { it . name == \"org.jetbrains.dokka.internal.test\" } val packageFunctions = testPackage . functions assertEquals ( 1 , packageFunctions . size ) assertEquals ( \"shouldBeVisible\" , packageFunctions [ 0 ] . name ) } } } } Note that the package of the tested code (inside testInline function) is the same as the package that we have hardcoded in our plugin. Make sure to change that to your own if you are following along, otherwise it will fail. Things to note and remember: Your test class should extend BaseAbstractTest , which contains base utility methods for testing. You can configure Dokka to your liking, enable some specific settings, configure source sets , etc. All done via dokkaConfiguration DSL. testInline function is the main entry point for unit tests You can pass plugins to be used in a test, notice pluginOverrides parameter You can write asserts for different stages of generating documentation, the main ones being Documentables model generation, Pages generation and Output generation. Since we implemented our plugin to work during PreMergeDocumentableTransformer stage, we can test it on the same level (that is preMergeDocumentablesTransformationStage ). You will need to write asserts using the model of whatever stage you choose. For Documentable transformation stage it's Documentable , for Page generation stage you would have Page model, and for Output you can have .html files that you will need to parse with JSoup (there are also utilities for that). Full source code of this tutorial can be found in Dokka's examples under hide-internal-api .","title":"Unit testing"}]}
\ No newline at end of file
diff --git a/2.0.0-SNAPSHOT/sitemap.xml b/2.0.0-SNAPSHOT/sitemap.xml
index 32b6cc9be6..e4f7730e71 100644
--- a/2.0.0-SNAPSHOT/sitemap.xml
+++ b/2.0.0-SNAPSHOT/sitemap.xml
@@ -2,72 +2,72 @@
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/introduction/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/workflow/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/architecture_overview/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/data_model/documentable_model/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/data_model/extra/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/data_model/page_content/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/extension_points/base_plugin/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/extension_points/core_extension_points/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/extension_points/extension_points/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/architecture/extension_points/generation_implementations/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/community/slack/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/plugin-development/introduction/
- 2024-01-24
+ 2024-03-04
daily
https://github.com/Kotlin/dokka/2.0.0-SNAPSHOT/developer_guide/plugin-development/sample-plugin-tutorial/
- 2024-01-24
+ 2024-03-04
daily
\ No newline at end of file
diff --git a/2.0.0-SNAPSHOT/sitemap.xml.gz b/2.0.0-SNAPSHOT/sitemap.xml.gz
index 0d8709fe95..b467040cc9 100644
Binary files a/2.0.0-SNAPSHOT/sitemap.xml.gz and b/2.0.0-SNAPSHOT/sitemap.xml.gz differ