Update the Flink docs to use the correct Iceberg metadata-table syntax, clean up the generic CREATE CATALOG example, and replace hardcoded runtime jar examples with version-templated names. Also fix the PyFlink install snippet and several wording issues in the writing and reading sections.
Co-authored-by: Codex <codex@openai.com>
PyFlink 1.6.1 has a known issue on macOS with Apple Silicon. See [FLINK-28786](https://issues.apache.org/jira/browse/FLINK-28786).
Install the Apache Flink dependency using `pip`:
```bash
pip install apache-flink=={{ flinkVersion }}
```
For more details, please refer to the [Python Table API](https://ci.apache.org/projects/flink/flink-docs-release-{{ flinkVersionMajor }}/docs/dev/python/table/intro_to_table_api/).
## Adding catalogs
Flink supports creating catalogs using Flink SQL.
### Catalog Configuration
A catalog is created and named by executing the following query (replace `<catalog_name>` with your catalog name and
`'<config_key>'='<config_value>'` with catalog implementation config):
```sql
CREATE CATALOG <catalog_name> WITH (
'type'='iceberg',
'<config_key>'='<config_value>'
);
```
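For instance, a Hive-backed Iceberg catalog might be created as follows (the metastore URI and warehouse path below are placeholders to adapt to your environment):

```sql
-- Example: a Hive-backed Iceberg catalog.
-- The URI and warehouse location are placeholders.
CREATE CATALOG hive_catalog WITH (
  'type'='iceberg',
  'catalog-type'='hive',
  'uri'='thrift://localhost:9083',
  'warehouse'='hdfs://nn:8020/warehouse/path'
);
```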
The following properties can be set globally and are not limited to a specific catalog implementation:
```sql
INSERT INTO `hive_catalog`.`default`.`sample` VALUES (1, 'a');
INSERT INTO `hive_catalog`.`default`.`sample` SELECT id, data FROM other_kafka_table;
```
To replace data in the table with the result of a query, use `INSERT OVERWRITE` in a batch job (Flink streaming jobs do not support `INSERT OVERWRITE`). Overwrites are atomic operations for Iceberg tables.
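For an unpartitioned table, a full overwrite in batch mode can be sketched as (the `SET` statement applies in the SQL client; the table name is reused from the examples above):

```sql
-- INSERT OVERWRITE only runs in batch mode.
SET 'execution.runtime-mode' = 'batch';

INSERT OVERWRITE `hive_catalog`.`default`.`sample` VALUES (1, 'a');
```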
Partitions that have rows produced by the SELECT query will be replaced, for example: