Package | Description
---|---
org.apache.hadoop.contrib.index.example |
org.apache.hadoop.contrib.index.mapred |
org.apache.hadoop.contrib.utils.join |
org.apache.hadoop.examples | Hadoop example code.
org.apache.hadoop.examples.dancing | This package is a distributed implementation of Knuth's dancing links algorithm that can run under Hadoop.
org.apache.hadoop.examples.terasort | This package consists of 3 map/reduce applications for Hadoop to compete in the annual terabyte sort competition.
org.apache.hadoop.filecache |
org.apache.hadoop.mapred | A software framework for easily writing applications which process vast amounts of data (multi-terabyte data-sets) in parallel on large clusters (thousands of nodes) of commodity hardware in a reliable, fault-tolerant manner (a minimal job sketch follows this table).
org.apache.hadoop.mapred.jobcontrol | Utilities for managing dependent jobs.
org.apache.hadoop.mapred.join | Given a set of sorted datasets keyed with the same class and yielding equal partitions, it is possible to effect a join of those datasets prior to the map.
org.apache.hadoop.mapred.lib | Library of generally useful mappers, reducers, and partitioners.
org.apache.hadoop.mapred.lib.aggregate | Classes for performing various counting and aggregations.
org.apache.hadoop.mapred.lib.db | org.apache.hadoop.mapred.lib.db Package
org.apache.hadoop.mapred.pipes | Hadoop Pipes allows C++ code to use Hadoop DFS and map/reduce.
org.apache.hadoop.mapreduce |
org.apache.hadoop.mapreduce.server.jobtracker |
org.apache.hadoop.mapreduce.server.tasktracker |
org.apache.hadoop.mapreduce.server.tasktracker.userlogs |
org.apache.hadoop.mapreduce.split |
org.apache.hadoop.mapreduce.task |
org.apache.hadoop.streaming | Hadoop Streaming is a utility which allows users to create and run Map-Reduce jobs with any executables (e.g. shell utilities) as the mapper and/or the reducer.
org.apache.hadoop.util |
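
The org.apache.hadoop.mapred package above is the classic user-facing API that most of the class tables below refer to. As a rough, minimal sketch of how its core pieces (JobConf, Mapper, Reducer, OutputCollector, Reporter, FileInputFormat, FileOutputFormat, JobClient) fit together, here is a word-count-style job; class names such as WordCountSketch are illustrative only, not part of the listed API.

```java
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.*;

public class WordCountSketch {

  // Mapper: splits each input line into words and emits (word, 1).
  public static class WordCountMap extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
      for (String token : value.toString().split("\\s+")) {
        if (!token.isEmpty()) {
          word.set(token);
          output.collect(word, ONE);
        }
      }
    }
  }

  // Reducer: sums the counts emitted for each word.
  public static class WordCountReduce extends MapReduceBase
      implements Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values,
                       OutputCollector<Text, IntWritable> output, Reporter reporter)
        throws IOException {
      int sum = 0;
      while (values.hasNext()) {
        sum += values.next().get();
      }
      output.collect(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf(WordCountSketch.class);   // the map/reduce job configuration
    conf.setJobName("wordcount-sketch");

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    conf.setMapperClass(WordCountMap.class);
    conf.setCombinerClass(WordCountReduce.class);
    conf.setReducerClass(WordCountReduce.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);   // submit the job and block until it completes
  }
}
```

MapReduceBase supplies empty configure and close implementations, which is why both the mapper and the reducer extend it.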
Class | Description
---|---
FileInputFormat | A base class for file-based InputFormat.
FileSplit | A section of an input file.
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
OutputCollector |
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc. (a usage sketch follows this table).
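
To make the Mapper and Reporter entries above concrete, here is a minimal sketch of a mapper that reports status and increments an application-defined counter; the enum RecordType and the filtering logic are illustrative assumptions, not part of the listed API.

```java
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class FilteringMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, NullWritable> {

  // Hypothetical counters for this illustration; any user-defined enum works.
  public enum RecordType { VALID, MALFORMED }

  public void map(LongWritable offset, Text line,
                  OutputCollector<Text, NullWritable> output, Reporter reporter)
      throws IOException {
    // Tell the framework we are alive and what we are doing.
    reporter.setStatus("processing offset " + offset.get());

    if (line.getLength() == 0) {
      // Update a custom counter instead of silently dropping the record.
      reporter.incrCounter(RecordType.MALFORMED, 1);
      return;
    }
    reporter.incrCounter(RecordType.VALID, 1);
    output.collect(line, NullWritable.get());
  }
}
```

Counter values reported this way are aggregated by the framework and can be read back from the RunningJob, as sketched after a later table.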
Class | Description
---|---
FileOutputFormat | A base class for OutputFormat.
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapReduceBase |
OutputCollector |
OutputFormat | OutputFormat describes the output-specification for a Map-Reduce job.
Partitioner | Partitions the key space (a custom Partitioner sketch follows this table).
RecordWriter | RecordWriter writes the output <key, value> pairs to an output file.
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
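
The Partitioner entry above decides which reduce task receives each intermediate key. A minimal sketch of a custom implementation, assuming a Text key whose prefix before a ':' should determine the partition (the prefix convention is an assumption made purely for illustration):

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.Partitioner;

// Routes keys that share the same "prefix:" to the same reducer.
public class PrefixPartitioner implements Partitioner<Text, IntWritable> {

  // Partitioner extends JobConfigurable, so job configuration arrives here.
  public void configure(JobConf job) {
    // No configuration needed for this sketch.
  }

  public int getPartition(Text key, IntWritable value, int numPartitions) {
    String s = key.toString();
    int idx = s.indexOf(':');
    String prefix = (idx >= 0) ? s.substring(0, idx) : s;
    // Mask off the sign bit so the result is non-negative.
    return (prefix.hashCode() & Integer.MAX_VALUE) % numPartitions;
  }
}
```

It would be wired into a job with conf.setPartitionerClass(PrefixPartitioner.class) alongside conf.setNumReduceTasks(n).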
Class | Description
---|---
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
OutputCollector |
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapReduceBase |
OutputCollector |
Partitioner | Partitions the key space.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
RunningJob | RunningJob is the user-interface to query for details on a running Map-Reduce job (a submit-and-poll sketch follows this table).
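
RunningJob (listed above) is the handle returned when a job is submitted asynchronously through JobClient. A small sketch of submitting and polling, assuming the JobConf has already been populated as in the earlier word-count sketch:

```java
import java.io.IOException;

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;

public class SubmitAndPoll {
  public static void main(String[] args) throws IOException, InterruptedException {
    JobConf conf = new JobConf(SubmitAndPoll.class);
    // ... mapper/reducer/input/output configuration elided ...

    JobClient client = new JobClient(conf);
    RunningJob job = client.submitJob(conf);   // returns immediately

    while (!job.isComplete()) {
      System.out.printf("map %.0f%%  reduce %.0f%%%n",
          job.mapProgress() * 100, job.reduceProgress() * 100);
      Thread.sleep(5000);
    }
    System.out.println(job.isSuccessful() ? "job succeeded" : "job failed");
  }
}
```

JobClient.runJob is the blocking alternative used in the earlier sketch.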
Class | Description
---|---
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapReduceBase |
OutputCollector |
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
FileInputFormat | A base class for file-based InputFormat.
FileOutputFormat | A base class for OutputFormat.
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapReduceBase |
OutputCollector |
OutputFormat | OutputFormat describes the output-specification for a Map-Reduce job.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
RecordWriter | RecordWriter writes the output <key, value> pairs to an output file.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
TextOutputFormat | An OutputFormat that writes plain text files.
Class | Description
---|---
InvalidJobConfException | This exception is thrown when the jobconf misses some mandatory attributes or the value of some attributes is invalid.
TaskController | Controls initialization, finalization and clean up of tasks, and also the launching and killing of task JVMs.
Class | Description
---|---
AdminOperationsProtocol | Protocol for admin operations.
CleanupQueue |
CleanupQueue.PathDeletionContext | Contains info related to the path of the file/dir to be deleted.
ClusterStatus | Status information on the current state of the Map-Reduce cluster.
Counters | Deprecated. Use org.apache.hadoop.mapreduce.Counters instead.
Counters.Counter | Deprecated. A counter record, comprising its name and value (a counter-reading sketch follows this table).
Counters.Group | Deprecated. Group of counters, comprising counters from a particular counter Enum class.
FileAlreadyExistsException | Used when the target file already exists for any operation and is not configured to be overwritten.
FileInputFormat | A base class for file-based InputFormat.
FileOutputFormat | A base class for OutputFormat.
FileSplit | A section of an input file.
IndexRecord |
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
InvalidJobConfException | This exception is thrown when the jobconf misses some mandatory attributes or the value of some attributes is invalid.
JobClient.TaskStatusFilter |
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
JobContext |
JobHistory.JobInfo | Helper class for logging or reading back events related to job start, finish or failure.
JobHistory.Keys | Job history files contain key="value" pairs, where keys belong to this enum.
JobHistory.Listener | Callback interface for reading back log events from JobHistory.
JobHistory.RecordTypes | Record types are identifiers for each line of log in history files.
JobHistory.Task | Helper class for logging or reading back events related to a Task's start, finish or failure.
JobHistory.TaskAttempt | Base class for Map and Reduce TaskAttempts.
JobHistory.Values | This enum contains some of the values commonly used by history log events.
JobID | JobID represents the immutable and unique identifier for the job.
JobInProgress | JobInProgress maintains all the info for keeping a Job on the straight and narrow.
JobInProgress.Counter |
JobPriority | Used to describe the priority of the running job.
JobProfile | A JobProfile is a MapReduce primitive.
JobQueueInfo | Class that contains the information regarding the Job Queues which are maintained by the Hadoop Map/Reduce framework.
JobStatus | Describes the current status of a job.
JobTracker | JobTracker is the central location for submitting and tracking MR jobs in a network environment.
JobTracker.State |
JobTrackerHADaemon.JobTrackerRunner |
JobTrackerHAServiceProtocol |
JobTrackerMXBean | The MXBean interface for JobTrackerInfo.
JobTrackerProxies.ProxyAndInfo | Wrapper for a client proxy as well as its associated service ID.
JTProtocols |
JvmTask |
MapOutputCollector |
MapOutputCollector.Context |
MapOutputFile | Manipulate the working area for the transient store for maps and reduces.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapRunnable | Expert: Generic interface for Mappers.
MapTask | A Map task.
MapTaskCompletionEventsUpdate | A class that represents the communication between the tasktracker and child tasks w.r.t. the map task completion events.
Operation | Generic operation that maps to the dependent set of ACLs that drive the authorization of the operation.
OutputCollector |
OutputCommitter | OutputCommitter describes the commit of task output for a Map-Reduce job.
OutputFormat | OutputFormat describes the output-specification for a Map-Reduce job.
Partitioner | Partitions the key space.
RawKeyValueIterator | RawKeyValueIterator is an iterator used to iterate over the raw keys and values during sort/merge of intermediate data.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
RecordWriter | RecordWriter writes the output <key, value> pairs to an output file.
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
ReduceTask | A Reduce task.
ReduceTask.ReduceCopier.MapOutput | Describes the output of a map; could be either on disk or in memory.
ReduceTask.ReduceCopier.MapOutputCopier | Copies map outputs as they become available.
ReduceTask.ReduceCopier.MapOutputLocation | Abstraction to track a map-output.
ReduceTask.ReduceCopier.ShuffleClientMetrics | This class contains the methods that should be used for reporting the shuffle-specific metrics.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
RunningJob | RunningJob is the user-interface to query for details on a running Map-Reduce job.
SequenceFileInputFilter.Filter | Filter interface.
SequenceFileInputFilter.FilterBase | Base class for Filters.
SequenceFileInputFormat | An InputFormat for SequenceFiles.
SequenceFileOutputFormat | An OutputFormat that writes SequenceFiles.
ShuffleConsumerPlugin |
ShuffleConsumerPlugin.Context |
ShuffleProviderPlugin | This interface is implemented by objects that are able to answer shuffle requests sent from a matching Shuffle Consumer that lives in the context of a ReduceTask object.
Task | Base class for tasks.
Task.CombinerRunner |
Task.Counter |
Task.TaskReporter |
TaskAttemptContext | Deprecated. Use org.apache.hadoop.mapreduce.TaskAttemptContext instead.
TaskAttemptID | TaskAttemptID represents the immutable and unique identifier for a task attempt.
TaskCompletionEvent | This is used to track task completion events on the job tracker.
TaskCompletionEvent.Status |
TaskController | Controls initialization, finalization and clean up of tasks, and also the launching and killing of task JVMs.
TaskID | TaskID represents the immutable and unique identifier for a Map or Reduce Task.
TaskInProgress | TaskInProgress maintains all the info needed for a Task in the lifetime of its owning Job.
TaskLog.LogName | The filter for userlogs.
TaskReport | A report on the state of a task.
TaskStatus | Describes the current status of a task.
TaskStatus.Phase |
TaskStatus.State |
TaskTracker | TaskTracker is a process that starts and tracks MR Tasks in a networked environment.
TaskTrackerMXBean | MXBean interface for TaskTracker.
TaskTrackerStatus | A TaskTrackerStatus is a MapReduce primitive.
TaskTrackerStatus.ResourceStatus | Class representing a collection of resources on this tasktracker.
TaskUmbilicalProtocol | Protocol that the task child process uses to contact its parent process.
TIPStatus | The states of a TaskInProgress as seen by the JobTracker.
Utils.OutputFileUtils.OutputLogFilter | This class filters log files from the given directory; it doesn't accept paths containing _logs.
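
Counters, Counters.Group and Counters.Counter above are what a client sees when it asks a finished RunningJob for its statistics. A small sketch, assuming the job is run synchronously with JobClient.runJob as in the earlier word-count example and that the classic (deprecated) mapred Counters iteration API is used:

```java
import java.io.IOException;

import org.apache.hadoop.mapred.Counters;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;

public class PrintJobCounters {
  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf(PrintJobCounters.class);
    // ... job configuration elided ...

    RunningJob job = JobClient.runJob(conf);   // blocks until the job finishes
    Counters counters = job.getCounters();

    // Counters are organized as groups of named counter records.
    for (Counters.Group group : counters) {
      System.out.println(group.getDisplayName());
      for (Counters.Counter counter : group) {
        System.out.printf("  %s = %d%n", counter.getDisplayName(), counter.getCounter());
      }
    }
  }
}
```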
Class | Description
---|---
JobClient | JobClient is the primary interface for the user-job to interact with the JobTracker.
JobConf | A map/reduce job configuration.
JobID | JobID represents the immutable and unique identifier for the job.
Class | Description
---|---
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
FileInputFormat | A base class for file-based InputFormat.
FileOutputFormat | A base class for OutputFormat.
FileSplit | A section of an input file.
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapReduceBase |
MapRunnable | Expert: Generic interface for Mappers.
OutputCollector |
OutputFormat | OutputFormat describes the output-specification for a Map-Reduce job.
Partitioner | Partitions the key space.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
RecordWriter | RecordWriter writes the output <key, value> pairs to an output file.
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
OutputCollector |
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
OutputFormat | OutputFormat describes the output-specification for a Map-Reduce job.
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
RecordWriter | RecordWriter writes the output <key, value> pairs to an output file.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
Class | Description
---|---
JobConf | A map/reduce job configuration.
RunningJob | RunningJob is the user-interface to query for details on a running Map-Reduce job.
Class | Description
---|---
ID | A general identifier, which internally stores the id as an integer.
JobClient | JobClient is the primary interface for the user-job to interact with the JobTracker.
TaskCompletionEvent | This is used to track task completion events on the job tracker.
Class | Description
---|---
JobInProgress | JobInProgress maintains all the info for keeping a Job on the straight and narrow.
JobTracker | JobTracker is the central location for submitting and tracking MR jobs in a network environment.
TaskTrackerStatus | A TaskTrackerStatus is a MapReduce primitive.
Class | Description
---|---
Task | Base class for tasks.
Class | Description
---|---
TaskController | Controls initialization, finalization and clean up of tasks, and also the launching and killing of task JVMs.
UserLogCleaner | This is used only in UserLogManager, to manage cleanup of user logs.
Class | Description
---|---
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
Class | Description
---|---
RawKeyValueIterator | RawKeyValueIterator is an iterator used to iterate over the raw keys and values during sort/merge of intermediate data.
Class | Description
---|---
FileInputFormat | A base class for file-based InputFormat.
FileSplit | A section of an input file.
InputFormat | InputFormat describes the input-specification for a Map-Reduce job.
InputSplit | InputSplit represents the data to be processed by an individual Mapper.
JobClient | JobClient is the primary interface for the user-job to interact with the JobTracker.
JobConf | A map/reduce job configuration.
JobConfigurable | Something that may be configured.
JobID | JobID represents the immutable and unique identifier for the job.
KeyValueTextInputFormat | An InputFormat for plain text files.
Mapper | Maps input key/value pairs to a set of intermediate key/value pairs.
MapRunnable | Expert: Generic interface for Mappers.
MapRunner | Default MapRunnable implementation.
OutputCollector |
RecordReader | RecordReader reads <key, value> pairs from an InputSplit.
Reducer | Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter | A facility for Map-Reduce applications to report progress and update counters, status information, etc.
RunningJob | RunningJob is the user-interface to query for details on a running Map-Reduce job.
Class | Description
---|---
JobConf | A map/reduce job configuration.
TaskController | Controls initialization, finalization and clean up of tasks, and also the launching and killing of task JVMs.
Copyright © 2009 The Apache Software Foundation