-
Processors
- AttributeRollingWindow
- AttributesToCSV
- AttributesToJSON
- CalculateRecordStats
- CaptureChangeMySQL
- CompressContent
- ConnectWebSocket
- ConsumeAMQP
- ConsumeAzureEventHub
- ConsumeElasticsearch
- ConsumeGCPubSub
- ConsumeIMAP
- ConsumeJMS
- ConsumeKafka
- ConsumeKinesisStream
- ConsumeMQTT
- ConsumePOP3
- ConsumeSlack
- ConsumeTwitter
- ConsumeWindowsEventLog
- ControlRate
- ConvertCharacterSet
- ConvertRecord
- CopyAzureBlobStorage_v12
- CopyS3Object
- CountText
- CryptographicHashContent
- DebugFlow
- DecryptContentAge
- DecryptContentPGP
- DeduplicateRecord
- DeleteAzureBlobStorage_v12
- DeleteAzureDataLakeStorage
- DeleteByQueryElasticsearch
- DeleteDynamoDB
- DeleteFile
- DeleteGCSObject
- DeleteGridFS
- DeleteMongo
- DeleteS3Object
- DeleteSFTP
- DeleteSQS
- DetectDuplicate
- DistributeLoad
- DuplicateFlowFile
- EncodeContent
- EncryptContentAge
- EncryptContentPGP
- EnforceOrder
- EvaluateJsonPath
- EvaluateXPath
- EvaluateXQuery
- ExecuteGroovyScript
- ExecuteProcess
- ExecuteScript
- ExecuteSQL
- ExecuteSQLRecord
- ExecuteStreamCommand
- ExtractAvroMetadata
- ExtractEmailAttachments
- ExtractEmailHeaders
- ExtractGrok
- ExtractHL7Attributes
- ExtractRecordSchema
- ExtractText
- FetchAzureBlobStorage_v12
- FetchAzureDataLakeStorage
- FetchBoxFile
- FetchDistributedMapCache
- FetchDropbox
- FetchFile
- FetchFTP
- FetchGCSObject
- FetchGoogleDrive
- FetchGridFS
- FetchS3Object
- FetchSFTP
- FetchSmb
- FilterAttribute
- FlattenJson
- ForkEnrichment
- ForkRecord
- GenerateFlowFile
- GenerateRecord
- GenerateTableFetch
- GeoEnrichIP
- GeoEnrichIPRecord
- GeohashRecord
- GetAsanaObject
- GetAwsPollyJobStatus
- GetAwsTextractJobStatus
- GetAwsTranscribeJobStatus
- GetAwsTranslateJobStatus
- GetAzureEventHub
- GetAzureQueueStorage_v12
- GetDynamoDB
- GetElasticsearch
- GetFile
- GetFTP
- GetGcpVisionAnnotateFilesOperationStatus
- GetGcpVisionAnnotateImagesOperationStatus
- GetHubSpot
- GetMongo
- GetMongoRecord
- GetS3ObjectMetadata
- GetSFTP
- GetShopify
- GetSmbFile
- GetSNMP
- GetSplunk
- GetSQS
- GetWorkdayReport
- GetZendesk
- HandleHttpRequest
- HandleHttpResponse
- IdentifyMimeType
- InvokeHTTP
- InvokeScriptedProcessor
- ISPEnrichIP
- JoinEnrichment
- JoltTransformJSON
- JoltTransformRecord
- JSLTTransformJSON
- JsonQueryElasticsearch
- ListAzureBlobStorage_v12
- ListAzureDataLakeStorage
- ListBoxFile
- ListDatabaseTables
- ListDropbox
- ListenFTP
- ListenHTTP
- ListenOTLP
- ListenSlack
- ListenSyslog
- ListenTCP
- ListenTrapSNMP
- ListenUDP
- ListenUDPRecord
- ListenWebSocket
- ListFile
- ListFTP
- ListGCSBucket
- ListGoogleDrive
- ListS3
- ListSFTP
- ListSmb
- LogAttribute
- LogMessage
- LookupAttribute
- LookupRecord
- MergeContent
- MergeRecord
- ModifyBytes
- ModifyCompression
- MonitorActivity
- MoveAzureDataLakeStorage
- Notify
- PackageFlowFile
- PaginatedJsonQueryElasticsearch
- ParseEvtx
- ParseNetflowv5
- ParseSyslog
- ParseSyslog5424
- PartitionRecord
- PublishAMQP
- PublishGCPubSub
- PublishJMS
- PublishKafka
- PublishMQTT
- PublishSlack
- PutAzureBlobStorage_v12
- PutAzureCosmosDBRecord
- PutAzureDataExplorer
- PutAzureDataLakeStorage
- PutAzureEventHub
- PutAzureQueueStorage_v12
- PutBigQuery
- PutBoxFile
- PutCloudWatchMetric
- PutDatabaseRecord
- PutDistributedMapCache
- PutDropbox
- PutDynamoDB
- PutDynamoDBRecord
- PutElasticsearchJson
- PutElasticsearchRecord
- PutEmail
- PutFile
- PutFTP
- PutGCSObject
- PutGoogleDrive
- PutGridFS
- PutKinesisFirehose
- PutKinesisStream
- PutLambda
- PutMongo
- PutMongoBulkOperations
- PutMongoRecord
- PutRecord
- PutRedisHashRecord
- PutS3Object
- PutSalesforceObject
- PutSFTP
- PutSmbFile
- PutSNS
- PutSplunk
- PutSplunkHTTP
- PutSQL
- PutSQS
- PutSyslog
- PutTCP
- PutUDP
- PutWebSocket
- PutZendeskTicket
- QueryAirtableTable
- QueryAzureDataExplorer
- QueryDatabaseTable
- QueryDatabaseTableRecord
- QueryRecord
- QuerySalesforceObject
- QuerySplunkIndexingStatus
- RemoveRecordField
- RenameRecordField
- ReplaceText
- ReplaceTextWithMapping
- RetryFlowFile
- RouteHL7
- RouteOnAttribute
- RouteOnContent
- RouteText
- RunMongoAggregation
- SampleRecord
- ScanAttribute
- ScanContent
- ScriptedFilterRecord
- ScriptedPartitionRecord
- ScriptedTransformRecord
- ScriptedValidateRecord
- SearchElasticsearch
- SegmentContent
- SendTrapSNMP
- SetSNMP
- SignContentPGP
- SplitAvro
- SplitContent
- SplitExcel
- SplitJson
- SplitPCAP
- SplitRecord
- SplitText
- SplitXml
- StartAwsPollyJob
- StartAwsTextractJob
- StartAwsTranscribeJob
- StartAwsTranslateJob
- StartGcpVisionAnnotateFilesOperation
- StartGcpVisionAnnotateImagesOperation
- TagS3Object
- TailFile
- TransformXml
- UnpackContent
- UpdateAttribute
- UpdateByQueryElasticsearch
- UpdateCounter
- UpdateDatabaseTable
- UpdateRecord
- ValidateCsv
- ValidateJson
- ValidateRecord
- ValidateXml
- VerifyContentMAC
- VerifyContentPGP
- Wait
-
Controller Services
- ADLSCredentialsControllerService
- ADLSCredentialsControllerServiceLookup
- AmazonGlueSchemaRegistry
- ApicurioSchemaRegistry
- AvroReader
- AvroRecordSetWriter
- AvroSchemaRegistry
- AWSCredentialsProviderControllerService
- AzureBlobStorageFileResourceService
- AzureCosmosDBClientService
- AzureDataLakeStorageFileResourceService
- AzureEventHubRecordSink
- AzureStorageCredentialsControllerService_v12
- AzureStorageCredentialsControllerServiceLookup_v12
- CEFReader
- ConfluentEncodedSchemaReferenceReader
- ConfluentEncodedSchemaReferenceWriter
- ConfluentSchemaRegistry
- CSVReader
- CSVRecordLookupService
- CSVRecordSetWriter
- DatabaseRecordLookupService
- DatabaseRecordSink
- DatabaseTableSchemaRegistry
- DBCPConnectionPool
- DBCPConnectionPoolLookup
- DistributedMapCacheLookupService
- ElasticSearchClientServiceImpl
- ElasticSearchLookupService
- ElasticSearchStringLookupService
- EmailRecordSink
- EmbeddedHazelcastCacheManager
- ExcelReader
- ExternalHazelcastCacheManager
- FreeFormTextRecordSetWriter
- GCPCredentialsControllerService
- GCSFileResourceService
- GrokReader
- HazelcastMapCacheClient
- HikariCPConnectionPool
- HttpRecordSink
- IPLookupService
- JettyWebSocketClient
- JettyWebSocketServer
- JMSConnectionFactoryProvider
- JndiJmsConnectionFactoryProvider
- JsonConfigBasedBoxClientService
- JsonPathReader
- JsonRecordSetWriter
- JsonTreeReader
- Kafka3ConnectionService
- KerberosKeytabUserService
- KerberosPasswordUserService
- KerberosTicketCacheUserService
- LoggingRecordSink
- MapCacheClientService
- MapCacheServer
- MongoDBControllerService
- MongoDBLookupService
- PropertiesFileLookupService
- ProtobufReader
- ReaderLookup
- RecordSetWriterLookup
- RecordSinkServiceLookup
- RedisConnectionPoolService
- RedisDistributedMapCacheClientService
- RestLookupService
- S3FileResourceService
- ScriptedLookupService
- ScriptedReader
- ScriptedRecordSetWriter
- ScriptedRecordSink
- SetCacheClientService
- SetCacheServer
- SimpleCsvFileLookupService
- SimpleDatabaseLookupService
- SimpleKeyValueLookupService
- SimpleRedisDistributedMapCacheClientService
- SimpleScriptedLookupService
- SiteToSiteReportingRecordSink
- SlackRecordSink
- SmbjClientProviderService
- StandardAsanaClientProviderService
- StandardAzureCredentialsControllerService
- StandardDropboxCredentialService
- StandardFileResourceService
- StandardHashiCorpVaultClientService
- StandardHttpContextMap
- StandardJsonSchemaRegistry
- StandardKustoIngestService
- StandardKustoQueryService
- StandardOauth2AccessTokenProvider
- StandardPGPPrivateKeyService
- StandardPGPPublicKeyService
- StandardPrivateKeyService
- StandardProxyConfigurationService
- StandardRestrictedSSLContextService
- StandardS3EncryptionService
- StandardSSLContextService
- StandardWebClientServiceProvider
- Syslog5424Reader
- SyslogReader
- UDPEventRecordSink
- VolatileSchemaCache
- WindowsEventLogReader
- XMLFileLookupService
- XMLReader
- XMLRecordSetWriter
- YamlTreeReader
- ZendeskRecordSink
XMLRecordSetWriter 2.0.0
- Bundle
- org.apache.nifi | nifi-record-serialization-services-nar
- Description
- Writes a RecordSet to XML. The records are wrapped by a root tag.
- Tags
- record, recordset, resultset, row, serialize, writer, xml
- Input Requirement
- Supports Sensitive Dynamic Properties
- false
-
Additional Details for XMLRecordSetWriter 2.0.0
XMLRecordSetWriter
The XMLRecordSetWriter Controller Service writes record objects to XML. The Controller Service must be configured with a schema that describes the structure of the record objects. Multiple records are wrapped by a root node whose name can be configured via the "Name of Root Tag" property. If no root tag is configured, the writer expects only one record per FlowFile, and that record will not be wrapped. As Avro does not support defining attributes for records, this writer currently does not support writing XML attributes.
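For instance, if no root tag is configured and "record_name" is configured as the record tag, a FlowFile containing the single record ( Field "name1" = "value1", Field "name2" = 42 ) would be written without any wrapping element (an illustrative rendering of the behaviour described above):
<record_name>
  <name1>value1</name1>
  <name2>42</name2>
</record_name>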
Example: Simple records
RecordSet ( Record ( Field "name1" = "value1", Field "name2" = 42 ), Record ( Field "name1" = "value2", Field "name2" = 84 ) )
These records can be described by the following schema:
{ "name": "test", "namespace": "nifi", "type": "record", "fields": [ { "name": "name1", "type": "string" }, { "name": "name2", "type": "int" } ] }
Assuming that “root_name” has been configured as the name for the root node and “record_name” has been configured as the name for the record nodes, the writer will write the following XML:
<root_name>
  <record_name>
    <name1>value1</name1>
    <name2>42</name2>
  </record_name>
  <record_name>
    <name1>value2</name1>
    <name2>84</name2>
  </record_name>
</root_name>
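To make the mapping from records to nested elements concrete, the following is a minimal, illustrative Java sketch (not the service's actual implementation) that produces the same wrapped output for the two example records using only the JDK's javax.xml.stream API. The root_name and record_name tag names are the assumed property values from above.

import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;
import java.io.StringWriter;

public class XmlWrappingSketch {
    public static void main(String[] args) throws Exception {
        // Field name/value pairs of the two example records, in schema order
        String[][][] records = {
            { { "name1", "value1" }, { "name2", "42" } },
            { { "name1", "value2" }, { "name2", "84" } }
        };

        StringWriter out = new StringWriter();
        XMLStreamWriter writer = XMLOutputFactory.newFactory().createXMLStreamWriter(out);
        // No writeStartDocument() call, so no XML declaration is emitted
        writer.writeStartElement("root_name");       // "Name of Root Tag"
        for (String[][] record : records) {
            writer.writeStartElement("record_name"); // "Name of Record Tag"
            for (String[] field : record) {
                writer.writeStartElement(field[0]);  // field name becomes the element name
                writer.writeCharacters(field[1]);    // field value becomes the element text
                writer.writeEndElement();
            }
            writer.writeEndElement();
        }
        writer.writeEndElement();
        writer.close();
        System.out.println(out);
    }
}

The sketch prints the same structure on a single line; in the actual writer, the "Pretty Print XML" property controls whether indentation like that shown above is added.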
The writer can also be configured to control how null or missing values in records are treated:
RecordSet ( Record ( Field "name1" = "value1", Field "name2" = null ), Record ( Field "name1" = "value2" ) )
If the writer is configured to always suppress missing or null values, only the field "name1" will appear in the XML. If the writer is configured to suppress only missing values, the field "name2" will appear as a node without content for the first record (where it is null) but will be omitted for the second record (where it is missing entirely). If the writer is configured never to suppress anything, the field "name2" will appear as a node without content in both records.
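Applied to the RecordSet above, and again assuming "root_name" and "record_name" as tag names, the three settings of the "Suppress Null Values" property would therefore produce the following illustrative outputs:
Never Suppress:
<root_name>
  <record_name>
    <name1>value1</name1>
    <name2></name2>
  </record_name>
  <record_name>
    <name1>value2</name1>
    <name2></name2>
  </record_name>
</root_name>
Suppress Missing Values:
<root_name>
  <record_name>
    <name1>value1</name1>
    <name2></name2>
  </record_name>
  <record_name>
    <name1>value2</name1>
  </record_name>
</root_name>
Always Suppress:
<root_name>
  <record_name>
    <name1>value1</name1>
  </record_name>
  <record_name>
    <name1>value2</name1>
  </record_name>
</root_name>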
Example: Arrays
The writer can also be configured to control how arrays are written:
RecordSet ( Record ( Field "name1" = "value1", Field "array_field" = [ 1, 2, 3 ] ) )
This record can be described by the following schema:
{ "name": "test", "namespace": "nifi", "type": "record", "fields": [ { "name": "array_field", "type": { "type": "array", "items": int } }, { "name": "name1", "type": "string" } ] }
If the writer is configured not to wrap arrays, it will transform the record to the following XML:
<root_name>
  <record_name>
    <name1>value1</name1>
    <array_field>1</array_field>
    <array_field>2</array_field>
    <array_field>3</array_field>
  </record_name>
</root_name>
If the writer is configured to wrap arrays using the field name as wrapper and “elem” as tag name for element nodes, it will transform the record to the following XML:
<root_name>
  <record_name>
    <name1>value1</name1>
    <array_field>
      <elem>1</elem>
      <elem>2</elem>
      <elem>3</elem>
    </array_field>
  </record_name>
</root_name>
If the writer is configured to wrap arrays using “wrap” as wrapper and the field name as tag name for element nodes, it will transform the record to the following XML:
<root_name>
  <record_name>
    <name1>value1</name1>
    <wrap>
      <array_field>1</array_field>
      <array_field>2</array_field>
      <array_field>3</array_field>
    </wrap>
  </record_name>
</root_name>
-
Properties
-
Array Tag Name
Name of the tag used by property "Wrap Elements of Arrays" to write arrays
- Display Name
- Array Tag Name
- Description
- Name of the tag used by property "Wrap Elements of Arrays" to write arrays
- API Name
- array_tag_name
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
-
Wrap Elements of Arrays
Specifies how the writer wraps elements of fields of type array
- Display Name
- Wrap Elements of Arrays
- Description
- Specifies how the writer wraps elements of fields of type array
- API Name
- array_wrapping
- Default Value
- no-wrapping
- Allowable Values
-
- Use Property as Wrapper
- Use Property for Elements
- No Wrapping
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Character Set
The Character set to use when writing the data to the FlowFile
- Display Name
- Character Set
- Description
- The Character set to use when writing the data to the FlowFile
- API Name
- Character Set
- Default Value
- UTF-8
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Date Format
Specifies the format to use when reading/writing Date fields. If not specified, Date fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, MM/dd/yyyy for a two-digit month, followed by a two-digit day, followed by a four-digit year, all separated by '/' characters, as in 01/01/2017).
- Display Name
- Date Format
- Description
- Specifies the format to use when reading/writing Date fields. If not specified, Date fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, MM/dd/yyyy for a two-digit month, followed by a two-digit day, followed by a four-digit year, all separated by '/' characters, as in 01/01/2017).
- API Name
- Date Format
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
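The pattern is interpreted by Java's java.time.format.DateTimeFormatter. As a quick, illustrative reference for that syntax (plain JDK code, not part of NiFi), the example pattern above can be exercised like this:

import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateFormatPatternExample {
    public static void main(String[] args) {
        // The example pattern from the property description above
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("MM/dd/yyyy");
        System.out.println(formatter.format(LocalDate.of(2017, 1, 1))); // prints 01/01/2017
    }
}

The "Time Format" and "Timestamp Format" properties below use the same pattern syntax.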
-
Omit XML Declaration
Specifies whether or not to include the XML declaration
- Display Name
- Omit XML Declaration
- Description
- Specifies whether or not to include the XML declaration
- API Name
- omit_xml_declaration
- Default Value
- false
- Allowable Values
-
- true
- false
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Pretty Print XML
Specifies whether or not the XML should be pretty printed
- Display Name
- Pretty Print XML
- Description
- Specifies whether or not the XML should be pretty printed
- API Name
- pretty_print_xml
- Default Value
- false
- Allowable Values
-
- true
- false
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Name of Record Tag
Specifies the name of the XML record tag wrapping the record fields. If this is not set, the writer will use the record name in the schema.
- Display Name
- Name of Record Tag
- Description
- Specifies the name of the XML record tag wrapping the record fields. If this is not set, the writer will use the record name in the schema.
- API Name
- record_tag_name
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
-
Name of Root Tag
Specifies the name of the XML root tag wrapping the record set. This property must be set if the writer is to write multiple records to a single FlowFile.
- Display Name
- Name of Root Tag
- Description
- Specifies the name of the XML root tag wrapping the record set. This property must be set if the writer is to write multiple records to a single FlowFile.
- API Name
- root_tag_name
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
-
Schema Reference Writer
Service implementation responsible for writing FlowFile attributes or content header with Schema reference information
- Display Name
- Schema Reference Writer
- Description
- Service implementation responsible for writing FlowFile attributes or content header with Schema reference information
- API Name
- Schema Reference Writer
- Service Interface
- org.apache.nifi.schemaregistry.services.SchemaReferenceWriter
- Service Implementations
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
- Dependencies
-
- Schema Write Strategy is set to any of [schema-reference-writer]
-
Schema Write Strategy
Specifies how the schema for a Record should be added to the data.
- Display Name
- Schema Write Strategy
- Description
- Specifies how the schema for a Record should be added to the data.
- API Name
- Schema Write Strategy
- Default Value
- no-schema
- Allowable Values
-
- Do Not Write Schema
- Set 'schema.name' Attribute
- Set 'avro.schema' Attribute
- Schema Reference Writer
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Schema Access Strategy
Specifies how to obtain the schema that is to be used for interpreting the data.
- Display Name
- Schema Access Strategy
- Description
- Specifies how to obtain the schema that is to be used for interpreting the data.
- API Name
- schema-access-strategy
- Default Value
- inherit-record-schema
- Allowable Values
-
- Inherit Record Schema
- Use 'Schema Name' Property
- Use 'Schema Text' Property
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Schema Branch
Specifies the name of the branch to use when looking up the schema in the Schema Registry property. If the chosen Schema Registry does not support branching, this value will be ignored.
- Display Name
- Schema Branch
- Description
- Specifies the name of the branch to use when looking up the schema in the Schema Registry property. If the chosen Schema Registry does not support branching, this value will be ignored.
- API Name
- schema-branch
- Expression Language Scope
- Environment variables and FlowFile Attributes
- Sensitive
- false
- Required
- false
- Dependencies
-
- Schema Access Strategy is set to any of [schema-name]
-
Schema Cache
Specifies a Schema Cache to add the Record Schema to so that Record Readers can quickly look up the schema.
- Display Name
- Schema Cache
- Description
- Specifies a Schema Cache to add the Record Schema to so that Record Readers can quickly look up the schema.
- API Name
- schema-cache
- Service Interface
- org.apache.nifi.serialization.RecordSchemaCacheService
- Service Implementations
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
-
Schema Name
Specifies the name of the schema to look up in the Schema Registry property
- Display Name
- Schema Name
- Description
- Specifies the name of the schema to look up in the Schema Registry property
- API Name
- schema-name
- Default Value
- ${schema.name}
- Expression Language Scope
- Environment variables and FlowFile Attributes
- Sensitive
- false
- Required
- false
- Dependencies
-
- Schema Access Strategy is set to any of [schema-name]
-
Schema Reference Reader
Service implementation responsible for reading FlowFile attributes or content to determine the Schema Reference Identifier
- Display Name
- Schema Reference Reader
- Description
- Service implementation responsible for reading FlowFile attributes or content to determine the Schema Reference Identifier
- API Name
- schema-reference-reader
- Service Interface
- org.apache.nifi.schemaregistry.services.SchemaReferenceReader
- Service Implementations
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
- Dependencies
-
- Schema Access Strategy is set to any of [schema-reference-reader]
-
Schema Registry
Specifies the Controller Service to use for the Schema Registry
- Display Name
- Schema Registry
- Description
- Specifies the Controller Service to use for the Schema Registry
- API Name
- schema-registry
- Service Interface
- org.apache.nifi.schemaregistry.services.SchemaRegistry
- Service Implementations
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
- Dependencies
-
- Schema Access Strategy is set to any of [schema-reference-reader, schema-name]
-
Schema Text
The text of an Avro-formatted Schema
- Display Name
- Schema Text
- Description
- The text of an Avro-formatted Schema
- API Name
- schema-text
- Default Value
- ${avro.schema}
- Expression Language Scope
- Environment variables and FlowFile Attributes
- Sensitive
- false
- Required
- false
- Dependencies
-
- Schema Access Strategy is set to any of [schema-text-property]
-
Schema Version
Specifies the version of the schema to look up in the Schema Registry. If not specified, the latest version of the schema will be retrieved.
- Display Name
- Schema Version
- Description
- Specifies the version of the schema to look up in the Schema Registry. If not specified, the latest version of the schema will be retrieved.
- API Name
- schema-version
- Expression Language Scope
- Environment variables and FlowFile Attributes
- Sensitive
- false
- Required
- false
- Dependencies
-
- Schema Access Strategy is set to any of [schema-name]
-
Suppress Null Values
Specifies how the writer should handle a null field
- Display Name
- Suppress Null Values
- Description
- Specifies how the writer should handle a null field
- API Name
- suppress_nulls
- Default Value
- never-suppress
- Allowable Values
-
- Never Suppress
- Always Suppress
- Suppress Missing Values
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- true
-
Time Format
Specifies the format to use when reading/writing Time fields. If not specified, Time fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, HH:mm:ss for a two-digit hour in 24-hour format, followed by a two-digit minute, followed by a two-digit second, all separated by ':' characters, as in 18:04:15).
- Display Name
- Time Format
- Description
- Specifies the format to use when reading/writing Time fields. If not specified, Time fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, HH:mm:ss for a two-digit hour in 24-hour format, followed by a two-digit minute, followed by a two-digit second, all separated by ':' characters, as in 18:04:15).
- API Name
- Time Format
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false
-
Timestamp Format
Specifies the format to use when reading/writing Timestamp fields. If not specified, Timestamp fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, MM/dd/yyyy HH:mm:ss for a two-digit month, followed by a two-digit day, followed by a four-digit year, all separated by '/' characters; and then followed by a two-digit hour in 24-hour format, followed by a two-digit minute, followed by a two-digit second, all separated by ':' characters, as in 01/01/2017 18:04:15).
- Display Name
- Timestamp Format
- Description
- Specifies the format to use when reading/writing Timestamp fields. If not specified, Timestamp fields will be assumed to be number of milliseconds since epoch (Midnight, Jan 1, 1970 GMT). If specified, the value must match the Java java.time.format.DateTimeFormatter format (for example, MM/dd/yyyy HH:mm:ss for a two-digit month, followed by a two-digit day, followed by a four-digit year, all separated by '/' characters; and then followed by a two-digit hour in 24-hour format, followed by a two-digit minute, followed by a two-digit second, all separated by ':' characters, as in 01/01/2017 18:04:15).
- API Name
- Timestamp Format
- Expression Language Scope
- Not Supported
- Sensitive
- false
- Required
- false