Commit e1eb8c9

Update github links for Kafka Live Viewer Profiles tutorial (#1134)

1 parent b89692b

3 files changed, +21 −21 lines changed

tutorials/kafka-live-viewer-profiles/deploy-aws-terraform.md

Lines changed: 4 additions & 4 deletions
@@ -15,9 +15,9 @@ The following [Steps](#steps) will deploy the solution accelerator to AWS using

 1. Open a terminal.
 2. Install **Docker** and **Docker Compose**.
-3. [Clone the project](https://github.com/snowplow-incubator/live-viewer-profiles) and navigate to its directory.
+3. [Clone the project](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles) and navigate to its directory.
 ```bash
-git clone https://github.com/snowplow-incubator/live-viewer-profiles.git
+git clone https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles.git
 ```
 4. Create a `.env` file based on `.env.example` and configure AWS variables.
 ```bash
@@ -58,7 +58,7 @@ $ ./up.sh # <- start the Docker containers

 ### Step 4: Open Access to the Applications

-Review the [LocalStack guide](/tutorials/kafka-live-viewer-profiles/quickstart-localstack) for the default configuration for each component. Open public access to the two frontend applications and the Snowplow Collector using a HTTP load balancer so that anyone can watch the video, submit events to the pipeline, and see information on concurrent users.
+Review the [LocalStack guide](/tutorials/kafka-live-viewer-profiles/quickstart-localstack) for the default configuration for each component. Open public access to the two frontend applications and the Snowplow Collector using a HTTP load balancer so that anyone can watch the video, submit events to the pipeline, and see information on concurrent users.

 The applications listen for HTTP traffic on the following ports
 - Web tracker front end - 3000
@@ -67,7 +67,7 @@ The applications listen for HTTP traffic on the following ports

 ## Next Steps
 - You can implement Snowplow media tracking on any [HTML5](/docs/sources/trackers/javascript-trackers/web-tracker/tracking-events/media/html5/) or [YouTube](/docs/sources/trackers/javascript-trackers/web-tracker/tracking-events/media/youtube/) media of your choice
-- Look into the output from Kafka and extend the Live Viewer to include information on the media being watched and the user.
+- Look into the output from Kafka and extend the Live Viewer to include information on the media being watched and the user.
 - Replace Amazon DynamoDB with an alternative to be cloud agnostic, e.g. Google Bigtable or MongoDB.
 ---

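The deploy steps above depend on a correctly filled-in `.env` file. As a side illustration (not part of the commit), the following Python sketch shows a minimal pre-flight check for such a file; the `AWS_*` key names are hypothetical examples, so check `.env.example` in the repository for the actual keys the accelerator expects.

```python
# Illustrative sketch: validate a .env file before running the deploy scripts.
# The AWS_* variable names below are hypothetical examples; the real keys are
# defined in the repository's .env.example.

def parse_dotenv(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

def missing_keys(env: dict[str, str], required: list[str]) -> list[str]:
    """Return required keys that are absent or left empty."""
    return [k for k in required if not env.get(k)]

if __name__ == "__main__":
    sample = """
    # AWS settings (example keys only)
    AWS_REGION=eu-west-1
    AWS_ACCESS_KEY_ID=AKIA...
    AWS_SECRET_ACCESS_KEY=
    """
    env = parse_dotenv(sample)
    print(missing_keys(env, ["AWS_REGION", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"]))
    # -> ['AWS_SECRET_ACCESS_KEY']
```

Running a check like this before `./up.sh` surfaces an empty or missing variable early, instead of as a container failure mid-deploy.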
tutorials/kafka-live-viewer-profiles/introduction.md

Lines changed: 11 additions & 11 deletions
@@ -4,34 +4,34 @@ title: Introduction
 ---

 ## About This Accelerator
-Welcome to the **live viewer profiles** solution accelerator for video streaming!
+Welcome to the **live viewer profiles** solution accelerator for video streaming!

-This accelerator demonstrates how to build a real-time use case leveraging **Snowplow event data** to create live viewer profiles for a video streaming site. By combining Snowplow's streaming pipeline with **Apache Kafka**, a **Java application** and **AWS DynamoDB**, the solution processes live streaming events to visualize user interactions with video content and advertisements.
+This accelerator demonstrates how to build a real-time use case leveraging **Snowplow event data** to create live viewer profiles for a video streaming site. By combining Snowplow's streaming pipeline with **Apache Kafka**, a **Java application** and **AWS DynamoDB**, the solution processes live streaming events to visualize user interactions with video content and advertisements.

-On the left side of the image below we have someone watching a video. Their events are sent through a Snowplow pipeline to Kafka where they are consumed and processed by an application. The result of this processing is displayed in the right window. This shows the number of active users and their current state.
+On the left side of the image below we have someone watching a video. Their events are sent through a Snowplow pipeline to Kafka where they are consumed and processed by an application. The result of this processing is displayed in the right window. This shows the number of active users and their current state.

 ![Application Output](images/one-viewer.png)

 Through this hands-on guide, you’ll learn how to build, deploy, and extend real-time, event-driven architectures using Snowplow and Kafka, enabling personalized recommendations, real-time insights, and dynamic analytics for streaming platforms. The framework is inspired by common challenges in video streaming, including tracking user behavior, ad engagement, and session activities, with the goal of maintaining up-to-date viewer profiles in DynamoDB.

 This accelerator is open source and can serve as the foundation to build practical applications like real-time viewer insights, engagement analytics, ad performance tracking, and personalized recommendations. Whether you're optimizing ad placements or enhancing viewer satisfaction, this guide equips you to unlock the full potential of Snowplow event data.

-Please start by reviewing how the application works in the next page on Localstack, even if you're planning to deploy with Terraform.
+Please start by reviewing how the application works in the next page on Localstack, even if you're planning to deploy with Terraform.

 ---

 ## Solution Accelerator Code
-[**The code for this infrastructure is available on here on GitHub.**](https://github.com/snowplow-incubator/live-viewer-profiles)
+[**The code for this infrastructure is available on here on GitHub.**](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles)

 ---

 ## Architecture

 The solution comprises several interconnected components:

-- **Web Tracking Application**:
+- **Web Tracking Application**:
   - A React application with a video to watch. Snowplow's media tracking has been configured to send events (e.g., play, pause, ad skipped) to the [Snowplow Collector](/docs/fundamentals/architecture-overview).
-  - Code available in [tracker-frontend](https://github.com/snowplow-incubator/live-viewer-profiles/tree/main/tracker-frontend) folder in GitHub
+  - Code available in [tracker-frontend](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles/tree/main/tracker-frontend) folder in GitHub

 - **Snowplow Collector**:
   - Collects and forwards events via [Stream Enrich](/docs/fundamentals/architecture-overview) and Kinesis to [Snowbridge](/docs/destinations/forwarding-events/snowbridge).
@@ -41,17 +41,17 @@ The solution comprises several interconnected components:

 - **Live Viewer Backend**:
   - A Java application which processes events from Kafka, stores the data in DynamoDB, and generates JSON state data for the Live Viewer Frontend
-  - Code available in [live-viewer-backend](https://github.com/snowplow-incubator/live-viewer-profiles/tree/main/live-viewer-backend) folder in GitHub
+  - Code available in [live-viewer-backend](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles/tree/main/live-viewer-backend) folder in GitHub

 - **Live Viewer Frontend**:
   - A HTML website which displays the state of users currently watching the video.
-  - Code available in [live-viewer-frontend](https://github.com/snowplow-incubator/live-viewer-profiles/tree/main/live-viewer-frontend) folder in GitHub
+  - Code available in [live-viewer-frontend](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles/tree/main/live-viewer-frontend) folder in GitHub

-The following diagram maps out where each component sits in the end to end communication flow.
+The following diagram maps out where each component sits in the end to end communication flow.
 ![Architecture Diagram](images/architecture.png)

 ### Components & Configuration
-The following files in the [GitHub repository](https://github.com/snowplow-incubator/live-viewer-profiles) can be used to configure the project's components.
+The following files in the [GitHub repository](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles) can be used to configure the project's components.
 - **Snowplow components**: `compose.snowplow.yaml`
 - **Kafka infrastructure**: `compose.kafka.yaml`
 - **Application components**: `compose.apps.yaml`
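The architecture section above describes the Live Viewer Backend folding Kafka media events into per-user state and a concurrent-viewer count. As a side note (not part of the commit), the core idea can be sketched in a few lines; the repository's actual implementation is Java, and the event fields and state names below are illustrative assumptions, not its real schema.

```python
# Illustrative sketch of the backend's core idea: fold media events consumed
# from Kafka into a per-user state table (the accelerator persists this in
# DynamoDB) and derive the concurrent-viewer count shown by the Live Viewer
# Frontend. Event fields and state names are examples, not the repo's schema.

WATCHING_STATES = {"play", "ad_start"}   # states that count as actively viewing

def apply_event(states: dict[str, str], event: dict) -> dict[str, str]:
    """Update the per-user state table with one event from the topic."""
    user, kind = event["user_id"], event["type"]
    if kind == "session_end":
        states.pop(user, None)           # user left: drop their profile row
    else:
        states[user] = kind              # remember the latest media action
    return states

def concurrent_viewers(states: dict[str, str]) -> int:
    """Number of users whose latest event puts them in a watching state."""
    return sum(1 for s in states.values() if s in WATCHING_STATES)

if __name__ == "__main__":
    states: dict[str, str] = {}
    for e in [
        {"user_id": "alice", "type": "play"},
        {"user_id": "bob", "type": "play"},
        {"user_id": "bob", "type": "pause"},
    ]:
        apply_event(states, e)
    print(concurrent_viewers(states))  # alice playing, bob paused -> 1
```

This is also the natural extension point the tutorials suggest: adding fields for the media being watched means carrying more of the event payload into the state table before it is written out.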

tutorials/kafka-live-viewer-profiles/quickstart-localstack.md

Lines changed: 6 additions & 6 deletions
@@ -11,9 +11,9 @@ title: Quickstart with Localstack

 1. Open a terminal.
 2. Install **Docker** and **Docker Compose**.
-3. [Clone the project](https://github.com/snowplow-incubator/live-viewer-profiles) and navigate to its directory.
+3. [Clone the project](https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles) and navigate to its directory.
 ```bash
-git clone https://github.com/snowplow-incubator/live-viewer-profiles.git
+git clone https://github.com/snowplow-industry-solutions/kafka-live-viewer-profiles.git
 ```
 4. Create a `.env` file based on `.env.example`. You can leave the AWS variables as placeholders when using Localstack
 ```bash
@@ -39,11 +39,11 @@ Details on everything that is installed can be found in [architecture](/tutorial

 ### Step 2: Open the Web Tracking Frontend

-Visit [http://localhost:3000](http://localhost:3000) to configure the Stream Collector endpoint and start tracking events. Enter the Collector URL: `localhost:9090` and click `Create tracker`.
+Visit [http://localhost:3000](http://localhost:3000) to configure the Stream Collector endpoint and start tracking events. Enter the Collector URL: `localhost:9090` and click `Create tracker`.

 ![First page of tracking website](images/tracker-demo.png)

-On the next screen, click `Custom media tracking demo`. This will bring up a video and a screen that displays information on what events are sent from the browser to the pipeline. If you want to simulate multiple users watching the video at the same time, you can open this in separate browsers.
+On the next screen, click `Custom media tracking demo`. This will bring up a video and a screen that displays information on what events are sent from the browser to the pipeline. If you want to simulate multiple users watching the video at the same time, you can open this in separate browsers.

 ![Welcome page on tracking website](images/welcome-page.png)

@@ -61,7 +61,7 @@ Congratulations! You have successfully run the accelerator to stream web behavio

 ## Next Steps
 - You can implement Snowplow media tracking on any [HTML5](/docs/sources/trackers/javascript-trackers/web-tracker/tracking-events/media/html5/) or [YouTube](/docs/sources/trackers/javascript-trackers/web-tracker/tracking-events/media/youtube/) media of your choice
-- Look into the output from Kafka and extend the Live Viewer to include information on the media being watched and the user.
+- Look into the output from Kafka and extend the Live Viewer to include information on the media being watched and the user.
 - Use our supplied Terraform in the next section to run this on AWS and make it publicly available.

 ## Other Things You Can Do
@@ -80,7 +80,7 @@ sudo ./lazydocker.sh

 ### Inspect Infrastructure with LocalStack UI

-Visit the [LocalStack UI](https://app.localstack.cloud/) to inspect infrastructure components such as Kinesis and DynamoDB. Please note that a LocalStack account is required to view this.
+Visit the [LocalStack UI](https://app.localstack.cloud/) to inspect infrastructure components such as Kinesis and DynamoDB. Please note that a LocalStack account is required to view this.

 ## Cleaning Up

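The introduction notes that the backend "generates JSON state data for the Live Viewer Frontend". As an aside to the diff, the Python sketch below shows what producing such a polling payload could look like; the field names (`active_users`, `viewers`, `state`) are illustrative assumptions, not the accelerator's actual wire format.

```python
# Illustrative sketch: render a per-user state table as the JSON document a
# frontend could poll. Field names are examples only, not the accelerator's
# actual wire format (its backend is a Java application).
import json

def build_state_json(states: dict[str, str]) -> str:
    """Serialize the state table, sorted by user for stable output."""
    payload = {
        "active_users": len(states),
        "viewers": [{"user_id": u, "state": s} for u, s in sorted(states.items())],
    }
    return json.dumps(payload)

if __name__ == "__main__":
    print(build_state_json({"alice": "play", "bob": "pause"}))
```

Keeping the payload a plain serializable snapshot like this is what lets a simple HTML frontend, such as the Live Viewer described above, refresh its display with nothing more than periodic fetches.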