HADOOP-9619 Mark stability of .proto files (sanjay Radia)

git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/trunk@1495564 13f79535-47bb-0310-9956-ffa450edef68
This commit is contained in:
Sanjay Radia 2013-06-21 19:53:21 +00:00
parent b194cbc6dd
commit 6cb5ad16d0
29 changed files with 208 additions and 23 deletions


@ -409,6 +409,8 @@ Release 2.1.0-beta - UNRELEASED
HADOOP-8608. Add Configuration API for parsing time durations. (cdouglas)

HADOOP-9619 Mark stability of .proto files (sanjay Radia)

OPTIMIZATIONS

HADOOP-9150. Avoid unnecessary DNS resolution attempts for logical URIs


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.ha.proto";
option java_outer_classname = "HAServiceProtocolProtos";
option java_generic_services = true;


@ -15,6 +15,13 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.ipc.protobuf";
option java_outer_classname = "IpcConnectionContextProtos";
option java_generate_equals_and_hash = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

/**
* These are the messages used by Hadoop RPC for the Rpc Engine Protocol Buffer
* to marshal the request and response in the RPC layer.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.ipc.protobuf";
option java_outer_classname = "ProtocolInfoProtos";
option java_generic_services = true;


@ -15,6 +15,13 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.ipc.protobuf";
option java_outer_classname = "RpcHeaderProtos";
option java_generate_equals_and_hash = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.security.proto";
option java_outer_classname = "SecurityProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.ha.proto";
option java_outer_classname = "ZKFCProtocolProtos";
option java_generic_services = true;


@ -98,9 +98,8 @@ hand-in-hand to address this.
Wire compatibility concerns data being transmitted over the wire between
Hadoop processes. Hadoop uses Protocol Buffers for most RPC communication.
Preserving compatibility requires prohibiting modification as described
below. Non-RPC communication should be considered as well, for example
using HTTP to transfer an HDFS image as part of snapshotting or
transferring MapTask output. The potential communications can be
categorized as follows:
@ -131,7 +130,7 @@ hand-in-hand to address this.
* Server-Server compatibility is required to allow mixed versions within an
  active cluster so the cluster may be upgraded without downtime in a
  rolling fashion.

*** Policy
@ -139,37 +138,56 @@ hand-in-hand to address this.
major release. (Different policies for different categories are yet to be
considered.)

* Compatibility can be broken only at a major release, though breaking
  compatibility even at major releases has grave consequences and should be
  discussed in the Hadoop community.

* Hadoop protocols are defined in .proto (ProtocolBuffers) files.
  Client-Server protocols and Server-Server protocol .proto files are marked
  as stable. When a .proto file is marked as stable, changes should be made
  in a compatible fashion as described below:

  * The following changes are compatible and are allowed at any time:

    * Add an optional field, with the expectation that the code deals with
      the field being missing when communicating with an older version of
      the code.

    * Add a new rpc/method to the service.

    * Add a new optional request to a Message.

    * Rename a field.

    * Rename a .proto file.

    * Change .proto annotations that affect code generation (e.g. the name
      of the java package).

  * The following changes are incompatible but can be considered only at a
    major release:

    * Change the rpc/method name.

    * Change the rpc/method parameter type or return type.

    * Remove an rpc/method.

    * Change the service name.

    * Change the name of a Message.

    * Modify a field type in an incompatible way (as defined recursively).

    * Change an optional field to required.

    * Add or delete a required field.

    * Delete an optional field, as long as the optional field has reasonable
      defaults to allow deletion.

  * The following changes are incompatible and hence never allowed:

    * Change a field id.

    * Reuse an old field number that was previously deleted; field numbers
      are cheap, so changing and reusing them is not a good idea.
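As a sketch of these rules, the following hypothetical stable .proto file
annotates which kinds of change each element would tolerate. The message,
service, and field names here are invented for illustration; they do not come
from any actual Hadoop protocol.

```proto
/**
 * These .proto interfaces are private and stable.
 * Please see http://wiki.apache.org/hadoop/Compatibility
 * for what changes are allowed for a *stable* .proto interface.
 */
// Changing annotations such as the java package only affects code
// generation, so it is allowed at any time.
option java_package = "org.apache.hadoop.example.proto";
option java_generic_services = true;

message GetStatusRequestProto {
  // Never change this field id, and never reuse id 1 if the field is
  // ever deleted: old peers still interpret wire tag 1 as serviceId.
  required string serviceId = 1;

  // An optional field added later: compatible at any time, but callers
  // must handle the field being absent when talking to an older peer.
  optional uint32 timeoutMs = 2;
}

message GetStatusResponseProto {
  // Changing this field's type incompatibly, making it optional->required
  // in reverse, or deleting it would only be possible at a major release.
  required bool healthy = 1;
}

service ExampleStatusServiceProto {
  // Adding a new rpc/method like this one is compatible at any time;
  // renaming it, changing its request/response types, or removing it
  // can be considered only at a major release.
  rpc getStatus(GetStatusRequestProto) returns (GetStatusResponseProto);
}
```

Renaming a field (say, serviceId to serviceName) is wire-compatible because
proto2 identifies fields by number, not name, which is also why field ids must
never be changed or reused.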
** Java Binary compatibility for end-user applications i.e. Apache Hadoop ABI


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.protocol.proto";
option java_outer_classname = "ClientNamenodeProtocolProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.protocol.proto";
option java_outer_classname = "GetUserMappingsProtocolProtos";
option java_generic_services = true;


@ -15,6 +15,13 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.server.namenode.ha.proto";
option java_outer_classname = "HAZKInfoProtos";
package hadoop.hdfs;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.qjournal.protocol";
option java_outer_classname = "QJournalProtocolProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.protocol.proto";
option java_outer_classname = "RefreshAuthorizationPolicyProtocolProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.hdfs.protocol.proto";
option java_outer_classname = "RefreshUserMappingsProtocolProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used to transfer data
// to and from the datanode, as well as between datanodes.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are private and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

// This file contains protocol buffers that are used throughout HDFS -- i.e.
// by the client, server, and data transfer protocols.


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "ApplicationClientProtocol";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "ApplicationMasterProtocol";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "ContainerManagementProtocol";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "ResourceManagerAdministrationProtocol";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "YarnProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "YarnServerResourceManagerServiceProtos";
option java_generic_services = true;


@ -16,6 +16,12 @@
* limitations under the License.
*/

/**
* These .proto interfaces are public and stable.
* Please see http://wiki.apache.org/hadoop/Compatibility
* for what changes are allowed for a *stable* .proto interface.
*/

option java_package = "org.apache.hadoop.yarn.proto";
option java_outer_classname = "YarnServiceProtos";
option java_generic_services = true;