Commit 2c80661

Merge pull request #58 from dzikosc/main: Update examples to 1.20

2 parents f784057 + 2d1a03a

31 files changed: +51 additions, -51 deletions


java/CustomMetrics/README.md (1 addition & 1 deletion)

```diff
@@ -1,6 +1,6 @@
 # Custom Metrics
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language: Java (11)
 
```

java/CustomMetrics/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <flink.connector.version>4.3.0-1.19</flink.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```
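Across all the `pom.xml` diffs in this commit, only the `flink.version` property changes; connector properties such as `flink.connector.version` stay at `4.3.0-1.19`, since the AWS connectors are versioned independently of Flink. As a hedged sketch of why bumping the single property is enough (the artifact below is illustrative, not quoted from these poms), a dependency typically references the property like this:

```xml
<!-- Illustrative sketch: a dependency referencing the Maven property,
     so bumping <flink.version> in <properties> updates it everywhere
     the property is used. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>
```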

java/GettingStarted/README.md (1 addition & 1 deletion)

```diff
@@ -2,7 +2,7 @@
 
 Skeleton project for a basic Flink Java application to run on Amazon Managed Service for Apache Flink.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language: Java (11)
 
```

java/GettingStarted/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```

java/GettingStartedTable/README.md (1 addition & 1 deletion)

```diff
@@ -2,7 +2,7 @@
 
 Example of project for a basic Flink Java application using the Table API & SQL in combination with the DataStream API.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL, and DataStream API
 * Language: Java (11)
 
```

java/GettingStartedTable/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
 </properties>
```

java/KafkaConnectors/README.md (1 addition & 1 deletion)

```diff
@@ -1,6 +1,6 @@
 # Flink Kafka Source & Sink Examples
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language: Java (11)
 
```

java/KafkaConnectors/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <kafka.connector.version>3.2.0-1.19</kafka.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.17.2</log4j.version>
```

java/KinesisConnectors/README.md (1 addition & 1 deletion)

```diff
@@ -1,6 +1,6 @@
 # Flink Kinesis Source & Sink examples (standard and EFO)
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language: Java (11)
 
```

java/KinesisConnectors/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```

java/KinesisFirehoseSink/README.md (1 addition & 1 deletion)

```diff
@@ -1,6 +1,6 @@
 # Flink Kinesis Firehose Sink example
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language: Java (11)
 
```

java/KinesisFirehoseSink/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <flink.connector.version>4.3.0-1.19</flink.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```

java/S3Sink/README.md (1 addition & 1 deletion)

```diff
@@ -1,6 +1,6 @@
 # S3 Sink
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Language Java (11)
 
```

java/S3Sink/pom.xml (1 addition & 1 deletion)

```diff
@@ -15,7 +15,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <flink.connector.version>4.3.0-1.19</flink.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```

java/Windowing/README.md (1 addition & 1 deletion)

```diff
@@ -2,7 +2,7 @@
 
 Example of project for a basic Flink Java application using data aggregation in Time based Windows.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: DataStream API
 * Flink Connectors: Kinesis Connector
 * Language: Java (11)
```

java/Windowing/pom.xml (1 addition & 1 deletion)

```diff
@@ -16,7 +16,7 @@
 <target.java.version>11</target.java.version>
 <maven.compiler.source>${target.java.version}</maven.compiler.source>
 <maven.compiler.target>${target.java.version}</maven.compiler.target>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 <log4j.version>2.23.1</log4j.version>
```

python/FirehoseSink/README.md (4 additions & 4 deletions)

```diff
@@ -2,7 +2,7 @@
 
 Example showing how to send data to Amazon Data Firehose from a PyFlink application.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL
 * Flink Connectors: Firehose Connector
 * Language: Python
@@ -17,12 +17,12 @@ Random data are generated internally, by the application.
 #### Development and build environment requirements
 
 * Python 3.11
-* PyFlink library: `apache-flink==1.19.1`
+* PyFlink library: `apache-flink==1.20.0`
 * Java JDK 11+ and Maven
 
 > ⚠️ As of 2024-06-27, the Flink Python library 1.19.x may fail installing on Python 3.12.
 > We recommend using Python 3.11 for development, the same Python version used by Amazon Managed Service for Apache Flink
-> runtime 1.19.
+> runtime 1.20.
 
 
 > JDK and Maven are uses to download and package any required Flink dependencies, e.g. connectors, and
@@ -124,7 +124,7 @@ Follow this process to make changes to the Python code
 
 ### Application structure
 
-The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/datagen/) connector.
+The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/connectors/table/datagen/) connector.
 No external data generator is required.
 
 Records are sent to a Firehose delivery stream, as JSON, without any transformations.
```
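The README hunk above references the DataGen connector for generating input records. As a hedged sketch (the table schema and field names are illustrative, not taken from the example project; the `WITH` options are standard DataGen connector options), such a source table can be declared with a DDL like:

```python
# Illustrative DataGen source-table DDL. The options used
# ('rows-per-second', 'fields.<name>.length/min/max') are documented
# DataGen connector options; the schema itself is made up for this sketch.
datagen_ddl = """
CREATE TABLE sample_source (
    ticker VARCHAR(6),
    price DOUBLE
) WITH (
    'connector' = 'datagen',
    'rows-per-second' = '10',
    'fields.ticker.length' = '6',
    'fields.price.min' = '1',
    'fields.price.max' = '100'
)
"""
# In a PyFlink job this string would be passed to table_env.execute_sql(...)
```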

python/FirehoseSink/pom.xml (1 addition & 1 deletion)

```diff
@@ -9,7 +9,7 @@
 <buildDirectory>${project.basedir}/target</buildDirectory>
 <zip.finalName>${project.name}-${project.version}</zip.finalName>
 <jar.finalName>pyflink-dependencies</jar.finalName>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 </properties>
```

python/GettingStarted/README.md (3 additions & 3 deletions)

```diff
@@ -2,7 +2,7 @@
 
 Sample PyFlink application reading from and writing to Kinesis Data Stream.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL
 * Flink Connectors: Kinesis Connector
 * Language: Python
@@ -22,13 +22,13 @@ The job can run both on Amazon Managed Service for Apache Flink, and locally for
 #### Development and build environment requirements
 
 * Python 3.11
-* PyFlink library: `apache-flink==1.19.1`
+* PyFlink library: `apache-flink==1.20.0`
 * Java JDK 11+ and Maven
 
 
 > ⚠️ As of 2024-06-27, the Flink Python library 1.19.x may fail installing on Python 3.12.
 > We recommend using Python 3.11 for development, the same Python version used by Amazon Managed Service for Apache Flink
-> runtime 1.19.
+> runtime 1.20.
 
 > JDK and Maven are used to download and package any required Flink dependencies, e.g. connectors, and
 to package the application as `.zip` file, for deployment to Amazon Managed Service for Apache Flink.
```

python/GettingStarted/main.py (1 addition & 1 deletion)

```diff
@@ -121,7 +121,7 @@ def main():
 
     # Some trick is required to generate the string defining the initial position, depending on the configuration
     # See Flink documentation for further details about configuring a Kinesis source table
-    # https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/kinesis/
+    # https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/connectors/table/kinesis/
     init_pos = "\n'scan.stream.initpos' = '{0}',".format(input_stream_initpos) if input_stream_initpos is not None else ''
     init_pos_timestamp = "\n'scan.stream.initpos-timestamp' = '{0}',".format(input_stream_initpos_timestamp) if input_stream_initpos_timestamp is not None else ''
 
```
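The two unchanged lines in the hunk above build optional DDL clauses depending on which configuration values are present. The same trick can be isolated in a small helper (the function name is hypothetical, not from the repository; the clause strings match those in `main.py`):

```python
# Hypothetical helper reproducing the trick from main.py: emit the
# optional 'scan.stream.initpos' clauses only when a value is configured,
# so the generated Kinesis source-table DDL stays valid either way.
def build_init_pos_clauses(initpos=None, initpos_timestamp=None):
    init_pos = "\n'scan.stream.initpos' = '{0}',".format(initpos) if initpos is not None else ''
    init_pos_timestamp = "\n'scan.stream.initpos-timestamp' = '{0}',".format(initpos_timestamp) if initpos_timestamp is not None else ''
    return init_pos + init_pos_timestamp

# With no initial position configured, no clause is emitted at all
assert build_init_pos_clauses() == ''
```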

python/GettingStarted/pom.xml (1 addition & 1 deletion)

```diff
@@ -12,7 +12,7 @@
 <buildDirectory>${project.basedir}/target</buildDirectory>
 <zip.finalName>${project.name}-${project.version}</zip.finalName>
 <jar.finalName>pyflink-dependencies</jar.finalName>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 </properties>
```

python/README.md (2 additions & 2 deletions)

```diff
@@ -27,11 +27,11 @@ Thre [Python Dependencies](./PythonDependencies/) example shows the most general
 
 There are some known issues with some specific Python and PyFlink versions, for local development
 
-* We recommend using **Python 3.11** to develop Python Flink 1.19.1 applications.
+* We recommend using **Python 3.11** to develop Python Flink 1.20.0 applications.
 This is also the runtime used by Amazon Managed Service for Apache Flink.
 Installation of the Python Flink 1.19 library on Python 3.12 may fail.
 * Installation of the Python Flink **1.15** library on machines based on **Apple Silicon** fail.
-We recommend upgrading to the Flink 1.19 or 1.18. Versions 1.18+ work correctly also on Apple Silicon machines.
+We recommend upgrading to the Flink 1.20, 1.19 or 1.18. Versions 1.18+ work correctly also on Apple Silicon machines.
 If you need to maintain a Flink 1.15 application using a machine based on Apple Silicon, you can follow [the guide to develop Flink 1.15 on Apple Silicon](LocalDevelopmentOnAppleSilicon).
 
```

python/S3Sink/README.md (7 additions & 7 deletions)

````diff
@@ -2,7 +2,7 @@
 
 Example showing a PyFlink application writing to S3.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL
 * Flink Connectors: FileSystem (S3)
 * Language: Python
@@ -20,13 +20,13 @@ The job can run both on Amazon Managed Service for Apache Flink, and locally for
 #### Development and build environment requirements
 
 * Python 3.11
-* PyFlink library: `apache-flink==1.19.1`
+* PyFlink library: `apache-flink==1.20.0`
 * Java JDK 11+ and Maven
 
 
 > ⚠️ As of 2024-06-27, the Flink Python library 1.19.x may fail installing on Python 3.12.
 > We recommend using Python 3.11 for development, the same Python version used by Amazon Managed Service for Apache Flink
-> runtime 1.19.
+> runtime 1.20.
 
 > JDK and Maven are required to download and package any required Flink dependencies, e.g. connectors, and
 to package the application as `.zip` file, for deployment to Amazon Managed Service for Apache Flink.
@@ -95,13 +95,13 @@ and copy it in the directory where PyFlink is installed.
 ```
 $ python -c "import pyflink;import os;print(os.path.dirname(os.path.abspath(pyflink.__file__)))"
 ```
-2. For Flink 1.19, download `flink-s3-fs-hadoop-1.19.1.jar` (the latest as of 2024-06-27)
-from [this link](https://repo1.maven.org/maven2/org/apache/flink/flink-s3-fs-hadoop/1.19.1/flink-s3-fs-hadoop-1.19.1.jar).
+2. For Flink 1.20, download `flink-s3-fs-hadoop-1.20.0.jar` (the latest as of 2024-08-27)
+from [this link](https://repo1.maven.org/maven2/org/apache/flink/flink-s3-fs-hadoop/1.20.0/flink-s3-fs-hadoop-1.20.0.jar).
 If you are using e different Flink version, download the plugin for the correct version
 (see [available plugin versions](https://mvnrepository.com/artifact/org.apache.flink/flink-s3-fs-hadoop)).
 3. Copy it into the `<flink-home>/lib/` directory
 
-> Note: [Flink documentation](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/deployment/filesystems/plugins/#file-systems)
+> Note: [Flink documentation](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/deployment/filesystems/plugins/#file-systems)
 > currently contains an error, stating that you need to install this dependency in the `<flink-home>/plugins` directory instead.
 
@@ -151,7 +151,7 @@ Follow this process to make changes to the Python code
 
 ### Application structure
 
-The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/datagen/) connector.
+The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/connectors/table/datagen/) connector.
 No external data generator is required.
 
 Generated records are written to a destination table, writing to S3.
````
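The plugin download step in the README hunk above changes with every Flink version. Assuming the Maven Central repository layout visible in the quoted link (an assumption of this sketch, not a new API), the URL for any version can be derived programmatically:

```python
# Sketch: derive the flink-s3-fs-hadoop plugin URL for a given Flink
# version, assuming the Maven Central layout used in the README link above.
def s3_plugin_url(flink_version):
    artifact = "flink-s3-fs-hadoop"
    return ("https://repo1.maven.org/maven2/org/apache/flink/"
            "{a}/{v}/{a}-{v}.jar".format(a=artifact, v=flink_version))

# Matches the 1.20.0 link quoted in the README hunk above
assert s3_plugin_url("1.20.0").endswith("flink-s3-fs-hadoop-1.20.0.jar")
```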

python/S3Sink/pom.xml (1 addition & 1 deletion)

```diff
@@ -12,7 +12,7 @@
 <buildDirectory>${project.basedir}/target</buildDirectory>
 <zip.finalName>${project.name}-${project.version}</zip.finalName>
 <jar.finalName>pyflink-dependencies</jar.finalName>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 </properties>
 
```

python/UDF/README.md (5 additions & 5 deletions)

```diff
@@ -2,12 +2,12 @@
 
 Example showing how to implement and use a User Defined Function (UDF) in PyFlink.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL
 * Flink Connectors: Kinesis Connector
 * Language: Python
 
-The application demonstrates the implementation of [User Defined Functions](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/python/table/udfs/overview/)
+The application demonstrates the implementation of [User Defined Functions](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/dev/python/table/udfs/overview/)
 in PyFlink.
 Random data are generated internally, by the application. The result is sent to a Kinesis Data Stream.
 
@@ -18,13 +18,13 @@ Random data are generated internally, by the application. The result is sent to
 #### Development and build environment requirements
 
 * Python 3.11
-* PyFlink library: `apache-flink==1.19.1`
+* PyFlink library: `apache-flink==1.20.0`
 * Java JDK 11+ and Maven
 
 
 > ⚠️ As of 2024-06-27, the Flink Python library 1.19.x may fail installing on Python 3.12.
 > We recommend using Python 3.11 for development, the same Python version used by Amazon Managed Service for Apache Flink
-> runtime 1.19.
+> runtime 1.20.
 
 > JDK and Maven are uses to download and package any required Flink dependencies, e.g. connectors, and
 to package the application as `.zip` file, for deployment to Amazon Managed Service for Apache Flink.
@@ -116,7 +116,7 @@ Follow this process to make changes to the Python code
 
 ### Application structure
 
-The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/datagen/) connector.
+The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/connectors/table/datagen/) connector.
 No external data generator is required.
 
 It demonstrates writing a query that uses a UDF, implemented in Python, and send the results to a Kinesis Data Stream.
```

python/UDF/pom.xml (1 addition & 1 deletion)

```diff
@@ -12,7 +12,7 @@
 <buildDirectory>${project.basedir}/target</buildDirectory>
 <zip.finalName>${project.name}-${project.version}</zip.finalName>
 <jar.finalName>pyflink-dependencies</jar.finalName>
-<flink.version>1.19.1</flink.version>
+<flink.version>1.20.0</flink.version>
 <aws.connector.version>4.3.0-1.19</aws.connector.version>
 <kda.runtime.version>1.2.0</kda.runtime.version>
 </properties>
```

python/Windowing/README.md (4 additions & 4 deletions)

```diff
@@ -2,7 +2,7 @@
 
 Example showing a basic PyFlink job doing data aggregation over time windows.
 
-* Flink version: 1.19
+* Flink version: 1.20
 * Flink API: Table API & SQL
 * Flink Connectors: Kinesis Connector
 * Language: Python
@@ -23,15 +23,15 @@ The job can run both on Amazon Managed Service for Apache Flink, and locally for
 #### Development and build environment requirements
 
 * Python 3.11
-* PyFlink library: `apache-flink==1.19.1`
+* PyFlink library: `apache-flink==1.20.0`
 * Java JDK 11+ and Maven
 
 > JDK and Maven are used to download and package any required Flink dependencies, e.g. connectors, and
 to package the application as `.zip` file, for deployment to Amazon Managed Service for Apache Flink.
 
 > ⚠️ As of 2024-06-27, the Flink Python library 1.19.x may fail installing on Python 3.12.
 > We recommend using Python 3.11 for development, the same Python version used by Amazon Managed Service for Apache Flink
-> runtime 1.19.
+> runtime 1.20.
 
 #### External dependencies
 
@@ -133,7 +133,7 @@ Follow this process to make changes to the Python code
 
 ### Application structure
 
-The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/connectors/table/datagen/) connector.
+The application generates synthetic data using the [DataGen](https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/connectors/table/datagen/) connector.
 No external data generator is required.
 
 It demonstrates 4 types of windowing aggregations:
```

python/Windowing/main.py (1 addition & 1 deletion)

```diff
@@ -141,7 +141,7 @@ def main():
     # the set.
     # Note that this only applies to INSERT statements. You can add as many CREATE TABLE as needed using execute_sql().
     # Only INSERT causes the job execution to be triggered.
-    # See: https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/table/sql/insert/#insert-statement
+    # See: https://nightlies.apache.org/flink/flink-docs-release-1.20/docs/dev/table/sql/insert/#insert-statement
 
     stmt_set = table_env.create_statement_set()
 
```
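The comments in the hunk above describe PyFlink's deferred-execution model: `CREATE TABLE` statements run immediately via `execute_sql()`, while INSERT statements are collected in a statement set and only submitted as one job on `execute()`. A toy stand-in class (not the PyFlink API, purely illustrative) makes the pattern concrete:

```python
# Toy stand-in for table_env.create_statement_set() (NOT the PyFlink API):
# INSERT statements are collected and nothing runs until execute().
class ToyStatementSet:
    def __init__(self):
        self._inserts = []

    def add_insert_sql(self, sql):
        self._inserts.append(sql)  # deferred; no job is triggered here
        return self

    def execute(self):
        # In Flink, this is the point where a single job is submitted
        # covering every collected INSERT; here we just report the count.
        return len(self._inserts)

stmt_set = ToyStatementSet()
stmt_set.add_insert_sql("INSERT INTO sliding_sink SELECT ...")
stmt_set.add_insert_sql("INSERT INTO tumbling_sink SELECT ...")
assert stmt_set.execute() == 2  # one submission, two INSERTs
```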
