public abstract class Compatibility extends Object
| Constructor and Description |
| --- |
| Compatibility() |
| Modifier and Type | Method and Description |
| --- | --- |
| abstract List<DirectInputFragment> | computeInputFragments(AbstractParquetFileFormat<?> format, StripedDataFormat.InputContext context): delegate for StripedDataFormat.computeInputFragments(InputContext). |
| abstract <T> ModelInput<T> | createInput(AbstractParquetFileFormat<T> format, Class<? extends T> dataType, org.apache.hadoop.fs.FileSystem fileSystem, org.apache.hadoop.fs.Path path, long offset, long fragmentSize, Counter counter): creates a ModelInput for reading the target file. |
| abstract <T> ModelOutput<T> | createOutput(AbstractParquetFileFormat<T> format, Class<? extends T> dataType, org.apache.hadoop.fs.FileSystem fileSystem, org.apache.hadoop.fs.Path path, Counter counter): creates a ModelOutput for writing the target file. |
| abstract Optional<?> | findValueDriver(org.apache.hadoop.hive.serde2.typeinfo.TypeInfo typeInfo, Class<?> valueClass): delegates to ParquetValueDrivers.find(). |
| abstract Optional<? extends Enum<?>> | findVersionId(String name): returns the version symbol for the given name. |
| abstract Class<? extends Enum<?>> | getCompressionCodecNameClass(): returns the compression codec name class. |
| static Compatibility | getInstance(): returns the compatibility instance for this environment. |
| protected abstract int | getPriority(): returns the priority of this compatibility layer. |
public static Compatibility getInstance()

Returns the compatibility instance for this environment.
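A minimal usage sketch; the import for Compatibility is omitted because the concrete package name is not shown on this page:

```java
// Obtain the compatibility layer that matches the Parquet/Hive libraries
// available in the current environment. When several layers are registered,
// getPriority() lets each implementation report its precedence.
Compatibility compat = Compatibility.getInstance();
```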
protected abstract int getPriority()

Returns the priority of this compatibility layer.
public abstract List<DirectInputFragment> computeInputFragments(AbstractParquetFileFormat<?> format, StripedDataFormat.InputContext context) throws IOException, InterruptedException

Delegate for StripedDataFormat.computeInputFragments(InputContext).

Parameters:
format - the source format
context - the current input context

Throws:
IOException - if failed to compute fragments by an I/O error
InterruptedException - if interrupted while computing fragments
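A hedged sketch of calling this delegate directly; format and context stand for an AbstractParquetFileFormat<?> instance and the StripedDataFormat.InputContext supplied by the Direct I/O runtime, and are assumed to be available:

```java
// Compute the input fragments for the given format and input context.
List<DirectInputFragment> fragments =
        Compatibility.getInstance().computeInputFragments(format, context);
for (DirectInputFragment fragment : fragments) {
    // each fragment describes a region of the Parquet input that can be
    // read separately, e.g. through createInput(...) with the matching
    // offset and fragment size
}
```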
public abstract <T> ModelInput<T> createInput(AbstractParquetFileFormat<T> format, Class<? extends T> dataType, org.apache.hadoop.fs.FileSystem fileSystem, org.apache.hadoop.fs.Path path, long offset, long fragmentSize, Counter counter) throws IOException, InterruptedException

Type Parameters:
T - the data type

Parameters:
format - the source format
dataType - the target data type
fileSystem - the file system to open the target path
path - the path to the target file
offset - the starting stream offset
fragmentSize - the suggested fragment size in bytes, or -1 if unlimited
counter - the current counter

Throws:
IOException - if failed to create the reader
InterruptedException - if interrupted
IllegalArgumentException - if this does not support the target property sequence, or any parameter is null
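A hedged sketch of reading records through the compatibility layer; MyDataModel is a hypothetical data model class, and format, fileSystem, path, and counter are assumed to be supplied by the calling context:

```java
try (ModelInput<MyDataModel> input = Compatibility.getInstance().createInput(
        format,            // AbstractParquetFileFormat<MyDataModel>
        MyDataModel.class, // target data type
        fileSystem,        // org.apache.hadoop.fs.FileSystem of the target path
        path,              // org.apache.hadoop.fs.Path of the target file
        0L,                // offset: start of the stream
        -1L,               // fragmentSize: -1 reads without a fragment limit
        counter)) {        // the current Counter
    MyDataModel record = new MyDataModel();
    while (input.readTo(record)) {
        // process the record that was just read into "record"
    }
}
```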
public abstract <T> ModelOutput<T> createOutput(AbstractParquetFileFormat<T> format, Class<? extends T> dataType, org.apache.hadoop.fs.FileSystem fileSystem, org.apache.hadoop.fs.Path path, Counter counter) throws IOException, InterruptedException

Type Parameters:
T - the data type

Parameters:
format - the source format
dataType - the target data type
fileSystem - the file system to open the target path
path - the path to the target file
counter - the current counter

Throws:
IOException - if failed to create the writer
InterruptedException - if interrupted
IllegalArgumentException - if this does not support the property sequence, or any parameter is null
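A matching sketch for the write side, under the same assumptions about MyDataModel, format, fileSystem, path, and counter:

```java
try (ModelOutput<MyDataModel> output = Compatibility.getInstance().createOutput(
        format, MyDataModel.class, fileSystem, path, counter)) {
    MyDataModel record = new MyDataModel();
    // ... set the record's properties ...
    output.write(record);
}
```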
public abstract Optional<? extends Enum<?>> findVersionId(String name)

Returns the version symbol for the given name.

Parameters:
name - the version name
public abstract Class<? extends Enum<?>> getCompressionCodecNameClass()

Returns the compression codec name class.
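A hedged sketch of querying this environment metadata; the version name "v1" is only an illustration, since the actual symbols depend on the underlying Parquet library:

```java
Compatibility compat = Compatibility.getInstance();

// Look up a version symbol by name (empty if the name is unknown).
Optional<? extends Enum<?>> version = compat.findVersionId("v1");
if (version.isPresent()) {
    System.out.println("version: " + version.get());
}

// Enumerate the compression codec names known to this environment.
Class<? extends Enum<?>> codecClass = compat.getCompressionCodecNameClass();
for (Enum<?> codec : codecClass.getEnumConstants()) {
    System.out.println("codec: " + codec.name());
}
```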
public abstract Optional<?> findValueDriver(org.apache.hadoop.hive.serde2.typeinfo.TypeInfo typeInfo, Class<?> valueClass)

Delegates to ParquetValueDrivers.find().

Parameters:
typeInfo - the Hive type info
valueClass - the ValueOption type

Returns:
the ParquetValueDriver, or empty if it is not found
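A hedged sketch of looking up a value driver for a Hive int column; TypeInfoFactory is Hive's standard factory for TypeInfo instances, and IntOption stands for the corresponding Asakusa ValueOption type used here only as an illustration:

```java
// Resolve the driver that maps a Hive "int" column to an IntOption value.
Optional<?> driver = Compatibility.getInstance().findValueDriver(
        org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory.intTypeInfo,
        IntOption.class);
if (!driver.isPresent()) {
    // no ParquetValueDriver is registered for this combination
}
```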
Copyright © 2011–2019 Asakusa Framework Team. All rights reserved.