
Problem with converting json -> avro -> json for union of records #14

Open
kpiska opened this issue Mar 15, 2017 · 3 comments


kpiska commented Mar 15, 2017

Hi,

Why does the test below fail?

Is it correct behavior that JSON containing "fieldB" is converted to the other record type (recordA), with the default "fieldA" value filled in?

    def 'should not fail'() {

        given:
        def schema = '''
        {
          "type": "record",
          "name": "testSchema",
          "fields": [
            {
              "name": "someMainField",
              "type": [
                {
                  "type": "record",
                  "name": "recordA",
                  "fields": [
                    {
                      "name": "fieldA",
                      "type": "string",
                      "default": ""
                    }
                  ]
                },
                {
                  "type": "record",
                  "name": "recordB",
                  "fields": [
                    {
                      "name": "fieldB",
                      "type": "string",
                      "default": ""
                    }
                  ]
                }
              ]
            }
          ]
        }
        '''

        def json = '''
        {  
           "someMainField":{  
              "fieldB":"B"
           }
        }
        '''

        when:
        def result = toMap(converter.convertToJson(converter.convertToAvro(json.bytes, schema), schema))

        then:
        !(result["someMainField"] as Map).containsKey("fieldA") // why fail?
        (result["someMainField"] as Map).containsKey("fieldB") // why fail?
    }

rprzystasz commented Mar 22, 2017

This converter will not map complex unions, because a plain JSON payload doesn't provide type information for union fields.

There is a pull request (allegro/hermes#749) in Hermes, which introduces Avro encoded JSON converter.

Switching between conversion methods would be controlled by the Content-Type request header: application/json for the current conversion, and avro/json for the Avro-encoded payload.

For the latter, a correct request would look like:

{
    "someMainField":{
        "recordB":{
            "fieldB":"B"
        }
    }
}

NOTE
If the schema defines a namespace, it must be prepended to custom type names (records, etc.) in the request.

So if the schema had the namespace pl.allegro.test, the payload would be:

{
    "someMainField":{
        "pl.allegro.test.recordB":{
            "fieldB":"B"
        }
    }
}
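This wrapped form matches Avro's own JSON encoding, so plain Avro (independent of the Hermes converter) can decode it directly with a JsonDecoder. A minimal sketch against the schema from this issue, without a namespace; the class name UnionJsonDecode is illustrative:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.JsonDecoder;

public class UnionJsonDecode {
    // The union schema from this issue, verbatim (no namespace)
    static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"testSchema\",\"fields\":[{\"name\":\"someMainField\","
        + "\"type\":[{\"type\":\"record\",\"name\":\"recordA\",\"fields\":[{\"name\":\"fieldA\","
        + "\"type\":\"string\",\"default\":\"\"}]},{\"type\":\"record\",\"name\":\"recordB\","
        + "\"fields\":[{\"name\":\"fieldB\",\"type\":\"string\",\"default\":\"\"}]}]}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        // Avro-encoded JSON: the union branch is wrapped in its type name
        String wrapped = "{\"someMainField\":{\"recordB\":{\"fieldB\":\"B\"}}}";
        JsonDecoder decoder = DecoderFactory.get().jsonDecoder(schema, wrapped);
        GenericRecord record =
            new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        GenericRecord inner = (GenericRecord) record.get("someMainField");
        // The wrapper told Avro which branch to pick
        System.out.println(inner.getSchema().getName());
        System.out.println(inner.get("fieldB"));
    }
}
```

With the unwrapped payload from the original test ({"someMainField":{"fieldB":"B"}}), the same decoder fails, which is exactly the ambiguity described above.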

@vspiliopoulos

@rprzystasz but I think the question has two parts:

  1. json to avro object (which is answered)
  2. avro object to json that includes the type names for union fields.

I am trying to do step (2). Any help more than appreciated!

@vspiliopoulos

Worked out easier than I thought:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    import org.apache.avro.io.DatumWriter;
    import org.apache.avro.io.EncoderFactory;
    import org.apache.avro.io.JsonEncoder;
    import org.apache.avro.specific.SpecificDatumWriter;
    import org.apache.avro.specific.SpecificRecord;

    public static <T extends SpecificRecord> String getJsonString(T record) throws IOException {
      try (ByteArrayOutputStream os = new ByteArrayOutputStream()) {
        JsonEncoder encoder = EncoderFactory.get().jsonEncoder(record.getSchema(), os);
        // T is bound to SpecificRecord, so a SpecificDatumWriter is always the right choice here
        DatumWriter<T> writer = new SpecificDatumWriter<>(record.getSchema());
        writer.write(record, encoder);
        encoder.flush();
        return new String(os.toByteArray(), StandardCharsets.UTF_8);
      }
    }
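The same approach works without generated classes: a GenericDatumWriter plus JsonEncoder produces Avro-encoded JSON in which union branches carry their type names, which is step (2). A self-contained sketch building this issue's recordB branch by hand; the class name UnionJsonEncode is illustrative:

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.io.JsonEncoder;

public class UnionJsonEncode {
    // The union schema from this issue, verbatim (no namespace)
    static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"testSchema\",\"fields\":[{\"name\":\"someMainField\","
        + "\"type\":[{\"type\":\"record\",\"name\":\"recordA\",\"fields\":[{\"name\":\"fieldA\","
        + "\"type\":\"string\",\"default\":\"\"}]},{\"type\":\"record\",\"name\":\"recordB\","
        + "\"fields\":[{\"name\":\"fieldB\",\"type\":\"string\",\"default\":\"\"}]}]}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        Schema unionSchema = schema.getField("someMainField").schema();
        Schema recordB = unionSchema.getTypes().get(1); // the recordB branch

        GenericRecord inner = new GenericData.Record(recordB);
        inner.put("fieldB", "B");
        GenericRecord outer = new GenericData.Record(schema);
        outer.put("someMainField", inner);

        ByteArrayOutputStream os = new ByteArrayOutputStream();
        JsonEncoder encoder = EncoderFactory.get().jsonEncoder(schema, os);
        new GenericDatumWriter<GenericRecord>(schema).write(outer, encoder);
        encoder.flush();
        // The union branch name ("recordB") appears in the output
        System.out.println(new String(os.toByteArray(), StandardCharsets.UTF_8));
    }
}
```

The output wraps the branch as {"recordB": ...}, i.e. the same Avro-encoded form discussed earlier in the thread.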
