HBASE-487 Replace hql w/ a hbase-friendly jirb or jython shell, Part 1: Purge hql, added raw jirb

git-svn-id: https://svn.apache.org/repos/asf/hadoop/hbase/trunk@659445 13f79535-47bb-0310-9956-ffa450edef68
Michael Stack 2008-05-23 06:21:16 +00:00
parent 423fca2f16
commit cb9d586513
51 changed files with 96 additions and 8173 deletions

View File

@@ -7,6 +7,8 @@ Hbase Change Log
compatible anyways
HBASE-82 Row keys should be array of bytes
HBASE-76 Purge servers of Text (Done as part of HBASE-82 commit).
HBASE-487 Replace hql w/ a hbase-friendly jirb or jython shell
Part 1: purge of hql and added raw jirb in its place.
BUG FIXES
HBASE-574 HBase does not load hadoop native libs (Rong-En Fan via Stack)

View File

@@ -183,7 +183,7 @@ unset IFS
# figure out which class to run
if [ "$COMMAND" = "shell" ] ; then
CLASS='org.apache.hadoop.hbase.Shell'
CLASS='org.jruby.Main --command irb'
elif [ "$COMMAND" = "master" ] ; then
CLASS='org.apache.hadoop.hbase.master.HMaster'
elif [ "$COMMAND" = "regionserver" ] ; then

View File

@@ -125,19 +125,7 @@
</exec>
</target>
<target name="javacc" if="javacc.home">
<echo message="javacc.home: ${javacc.home}"/>
<property name="hql.src.dir"
value="${src.dir}/org/apache/hadoop/hbase/hql" />
<mkdir dir="${hql.src.dir}/generated" />
<javacc
target="${hql.src.dir}/HQLParser.jj"
outputdirectory="${hql.src.dir}/generated"
javacchome="${javacc.home}"
/>
</target>
<target name="compile" depends="init,javacc,jspc">
<target name="compile" depends="init,jspc">
<!--Compile whats under src and generated java classes made from jsp-->
<javac
encoding="${build.encoding}"

View File

@@ -282,12 +282,6 @@
if true, enable audible keyboard bells if an alert is required.
</description>
</property>
<property>
<name>hbaseshell.formatter</name>
<value>org.apache.hadoop.hbase.hql.formatter.AsciiTableFormatter</value>
<description>TableFormatter to use outputting HQL result sets.
</description>
</property>
<property>
<name>hbase.regionserver.globalMemcacheLimit</name>
<value>536870912</value>

View File

@@ -23,6 +23,7 @@
# The java implementation to use. Required.
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun
export JAVA_HOME=/usr
# Extra Java CLASSPATH elements. Optional.
# export HBASE_CLASSPATH=

Binary file not shown.

View File

@@ -0,0 +1,86 @@
Common Public License - v 1.0
THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS COMMON PUBLIC LICENSE ("AGREEMENT"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.
1. DEFINITIONS
"Contribution" means:
a) in the case of the initial Contributor, the initial code and documentation distributed under this Agreement, and
b) in the case of each subsequent Contributor:
i) changes to the Program, and
ii) additions to the Program;
where such changes and/or additions to the Program originate from and are distributed by that particular Contributor. A Contribution 'originates' from a Contributor if it was added to the Program by such Contributor itself or anyone acting on such Contributor's behalf. Contributions do not include additions to the Program which: (i) are separate modules of software distributed in conjunction with the Program under their own license agreement, and (ii) are not derivative works of the Program.
"Contributor" means any person or entity that distributes the Program.
"Licensed Patents " mean patent claims licensable by a Contributor which are necessarily infringed by the use or sale of its Contribution alone or when combined with the Program.
"Program" means the Contributions distributed in accordance with this Agreement.
"Recipient" means anyone who receives the Program under this Agreement, including all Contributors.
2. GRANT OF RIGHTS
a) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, distribute and sublicense the Contribution of such Contributor, if any, and such derivative works, in source code and object code form.
b) Subject to the terms of this Agreement, each Contributor hereby grants Recipient a non-exclusive, worldwide, royalty-free patent license under Licensed Patents to make, use, sell, offer to sell, import and otherwise transfer the Contribution of such Contributor, if any, in source code and object code form. This patent license shall apply to the combination of the Contribution and the Program if, at the time the Contribution is added by the Contributor, such addition of the Contribution causes such combination to be covered by the Licensed Patents. The patent license shall not apply to any other combinations which include the Contribution. No hardware per se is licensed hereunder.
c) Recipient understands that although each Contributor grants the licenses to its Contributions set forth herein, no assurances are provided by any Contributor that the Program does not infringe the patent or other intellectual property rights of any other entity. Each Contributor disclaims any liability to Recipient for claims brought by any other entity based on infringement of intellectual property rights or otherwise. As a condition to exercising the rights and licenses granted hereunder, each Recipient hereby assumes sole responsibility to secure any other intellectual property rights needed, if any. For example, if a third party patent license is required to allow Recipient to distribute the Program, it is Recipient's responsibility to acquire that license before distributing the Program.
d) Each Contributor represents that to its knowledge it has sufficient copyright rights in its Contribution, if any, to grant the copyright license set forth in this Agreement.
3. REQUIREMENTS
A Contributor may choose to distribute the Program in object code form under its own license agreement, provided that:
a) it complies with the terms and conditions of this Agreement; and
b) its license agreement:
i) effectively disclaims on behalf of all Contributors all warranties and conditions, express and implied, including warranties or conditions of title and non-infringement, and implied warranties or conditions of merchantability and fitness for a particular purpose;
ii) effectively excludes on behalf of all Contributors all liability for damages, including direct, indirect, special, incidental and consequential damages, such as lost profits;
iii) states that any provisions which differ from this Agreement are offered by that Contributor alone and not by any other party; and
iv) states that source code for the Program is available from such Contributor, and informs licensees how to obtain it in a reasonable manner on or through a medium customarily used for software exchange.
When the Program is made available in source code form:
a) it must be made available under this Agreement; and
b) a copy of this Agreement must be included with each copy of the Program.
Contributors may not remove or alter any copyright notices contained within the Program.
Each Contributor must identify itself as the originator of its Contribution, if any, in a manner that reasonably allows subsequent Recipients to identify the originator of the Contribution.
4. COMMERCIAL DISTRIBUTION
Commercial distributors of software may accept certain responsibilities with respect to end users, business partners and the like. While this license is intended to facilitate the commercial use of the Program, the Contributor who includes the Program in a commercial product offering should do so in a manner which does not create potential liability for other Contributors. Therefore, if a Contributor includes the Program in a commercial product offering, such Contributor ("Commercial Contributor") hereby agrees to defend and indemnify every other Contributor ("Indemnified Contributor") against any losses, damages and costs (collectively "Losses") arising from claims, lawsuits and other legal actions brought by a third party against the Indemnified Contributor to the extent caused by the acts or omissions of such Commercial Contributor in connection with its distribution of the Program in a commercial product offering. The obligations in this section do not apply to any claims or Losses relating to any actual or alleged intellectual property infringement. In order to qualify, an Indemnified Contributor must: a) promptly notify the Commercial Contributor in writing of such claim, and b) allow the Commercial Contributor to control, and cooperate with the Commercial Contributor in, the defense and any related settlement negotiations. The Indemnified Contributor may participate in any such claim at its own expense.
For example, a Contributor might include the Program in a commercial product offering, Product X. That Contributor is then a Commercial Contributor. If that Commercial Contributor then makes performance claims, or offers warranties related to Product X, those performance claims and warranties are such Commercial Contributor's responsibility alone. Under this section, the Commercial Contributor would have to defend claims against the other Contributors related to those performance claims and warranties, and if a court requires any other Contributor to pay any damages as a result, the Commercial Contributor must pay those damages.
5. NO WARRANTY
EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the appropriateness of using and distributing the Program and assumes all risks associated with its exercise of rights under this Agreement, including but not limited to the risks and costs of program errors, compliance with applicable laws, damage to or loss of data, programs or equipment, and unavailability or interruption of operations.
6. DISCLAIMER OF LIABILITY
EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
7. GENERAL
If any provision of this Agreement is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this Agreement, and without further action by the parties hereto, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable.
If Recipient institutes patent litigation against a Contributor with respect to a patent applicable to software (including a cross-claim or counterclaim in a lawsuit), then any patent licenses granted by that Contributor to such Recipient under this Agreement shall terminate as of the date such litigation is filed. In addition, if Recipient institutes patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Program itself (excluding combinations of the Program with other software or hardware) infringes such Recipient's patent(s), then such Recipient's rights granted under Section 2(b) shall terminate as of the date such litigation is filed.
All Recipient's rights under this Agreement shall terminate if it fails to comply with any of the material terms or conditions of this Agreement and does not cure such failure in a reasonable period of time after becoming aware of such noncompliance. If all Recipient's rights under this Agreement terminate, Recipient agrees to cease use and distribution of the Program as soon as reasonably practicable. However, Recipient's obligations under this Agreement and any licenses granted by Recipient relating to the Program shall continue and survive.
Everyone is permitted to copy and distribute copies of this Agreement, but in order to avoid inconsistency the Agreement is copyrighted and may only be modified in the following manner. The Agreement Steward reserves the right to publish new versions (including revisions) of this Agreement from time to time. No one other than the Agreement Steward has the right to modify this Agreement. IBM is the initial Agreement Steward. IBM may assign the responsibility to serve as the Agreement Steward to a suitable separate entity. Each new version of the Agreement will be given a distinguishing version number. The Program (including Contributions) may always be distributed subject to the version of the Agreement under which it was received. In addition, after a new version of the Agreement is published, Contributor may elect to distribute the Program (including its Contributions) under the new version. Except as expressly stated in Sections 2(a) and 2(b) above, Recipient receives no rights or licenses to the intellectual property of any Contributor under this Agreement, whether expressly, by implication, estoppel or otherwise. All rights in the Program not expressly granted under this Agreement are reserved.
This Agreement is governed by the laws of the State of New York and the intellectual property laws of the United States of America. No party to this Agreement will bring a legal action under this Agreement more than one year after the cause of action arose. Each party waives its rights to a jury trial in any resulting litigation.

Binary file not shown.

View File

@@ -1,143 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import jline.ConsoleReader;
import org.apache.hadoop.hbase.hql.Constants;
import org.apache.hadoop.hbase.hql.HQLClient;
import org.apache.hadoop.hbase.hql.HQLSecurityManager;
import org.apache.hadoop.hbase.hql.HelpCommand;
import org.apache.hadoop.hbase.hql.ReturnMsg;
import org.apache.hadoop.hbase.hql.TableFormatter;
import org.apache.hadoop.hbase.hql.TableFormatterFactory;
import org.apache.hadoop.hbase.hql.formatter.HtmlTableFormatter;
/**
* An hbase shell.
*
* @see <a
* href="http://wiki.apache.org/lucene-hadoop/Hbase/HbaseShell">HbaseShell</a>
*/
public class Shell {
/** audible keyboard bells */
public static final boolean DEFAULT_BELL_ENABLED = true;
public static String IP = null;
public static int PORT = -1;
public static String HTML_OPTION = null;
/** Return the boolean value indicating whether end of command or not */
static boolean isEndOfCommand(String line) {
return (line.lastIndexOf(';') > -1) ? true : false;
}
/** Return the string of prompt start string */
private static String getPrompt(final StringBuilder queryStr) {
return (queryStr.toString().equals("")) ? "hql > " : " --> ";
}
/**
* @param watch true if execution time should be computed and returned
* @param start start of time interval
* @param end end of time interval
* @return a string of code execution time.
*/
public static String executeTime(boolean watch, long start, long end) {
return watch ? " ("
+ String.format("%.2f", Double.valueOf((end - start) * 0.001)) + " sec)"
: "";
}
/**
* Main method
*
* @param args not used
* @throws IOException
*/
public static void main(String args[]) throws IOException {
argumentParsing(args);
if (args.length != 0) {
if (args[0].equals("--help") || args[0].equals("-h")) {
System.out
.println("Usage: ./bin/hbase shell [--master:master_address:port] [--html]\n");
System.exit(1);
}
}
HBaseConfiguration conf = new HBaseConfiguration();
ConsoleReader reader = new ConsoleReader();
System.setSecurityManager(new HQLSecurityManager());
reader.setBellEnabled(conf.getBoolean("hbaseshell.jline.bell.enabled",
DEFAULT_BELL_ENABLED));
Writer out = new OutputStreamWriter(System.out, "UTF-8");
TableFormatter tableFormatter = new TableFormatterFactory(out, conf).get();
if (HTML_OPTION != null) {
tableFormatter = new HtmlTableFormatter(out);
}
HelpCommand help = new HelpCommand(out, tableFormatter);
if (args.length == 0 || !args[0].equals(String.valueOf(Constants.FLAG_RELAUNCH))) {
help.printVersion();
}
StringBuilder queryStr = new StringBuilder();
String extendedLine;
HQLClient hql = new HQLClient(conf, IP, PORT, out, tableFormatter);
while ((extendedLine = reader.readLine(getPrompt(queryStr))) != null) {
if (isEndOfCommand(extendedLine)) {
queryStr.append(" " + extendedLine);
long start = System.currentTimeMillis();
ReturnMsg rs = hql.executeQuery(queryStr.toString());
long end = System.currentTimeMillis();
if (rs != null) {
if (rs != null && rs.getType() > Constants.ERROR_CODE)
System.out.println(rs.getMsg() +
executeTime((rs.getType() == 1), start, end));
else if (rs.getType() == Constants.ERROR_CODE)
System.out.println(rs.getMsg());
}
queryStr = new StringBuilder();
} else {
queryStr.append(" " + extendedLine);
}
}
System.out.println();
}
private static void argumentParsing(String[] args) {
for (int i = 0; i < args.length; i++) {
if (args[i].toLowerCase().startsWith("--master:")) {
String[] address = args[i].substring(9, args[i].length()).split(":");
IP = address[0];
PORT = Integer.valueOf(address[1]);
} else if (args[i].toLowerCase().startsWith("--html")) {
HTML_OPTION = args[i];
}
}
}
}

View File

@@ -1,257 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.client.HConnectionManager;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.hbase.BloomFilterDescriptor;
import org.apache.hadoop.hbase.BloomFilterDescriptor.BloomFilterType;
/**
* Alters tables.
*/
public class AlterCommand extends SchemaModificationCommand {
public enum OperationType {
ADD, DROP, CHANGE, NOOP
}
private OperationType operationType = OperationType.NOOP;
private Map<String, Map<String, Object>> columnSpecMap = new HashMap<String, Map<String, Object>>();
private String tableName;
private String column; // column to be dropped
public AlterCommand(Writer o) {
super(o);
}
@SuppressWarnings("unchecked")
public ReturnMsg execute(HBaseConfiguration conf) {
try {
HConnection conn = HConnectionManager.getConnection(conf);
if (!conn.tableExists(Bytes.toBytes(this.tableName))) {
return new ReturnMsg(0, "'" + this.tableName + "'" + TABLE_NOT_FOUND);
}
HBaseAdmin admin = new HBaseAdmin(conf);
Set<String> columns = null;
HColumnDescriptor columnDesc = null;
switch (operationType) {
case ADD:
disableTable(admin, tableName);
columns = columnSpecMap.keySet();
for (String c : columns) {
columnDesc = getColumnDescriptor(c, columnSpecMap.get(c));
println("Adding " + c + " to " + tableName + "... Please wait.");
admin.addColumn(new Text(tableName), columnDesc);
}
enableTable(admin, tableName);
break;
case DROP:
disableTable(admin, tableName);
println("Dropping " + column + " from " + tableName + "... Please wait.");
column = appendDelimiter(column);
admin.deleteColumn(new Text(tableName), new Text(column));
enableTable(admin, tableName);
break;
case CHANGE:
disableTable(admin, tableName);
Map.Entry<String, Map<String, Object>> columnEntry = (Map.Entry<String, Map<String, Object>>) columnSpecMap
.entrySet().toArray()[0];
// add the : if there isn't one
Text columnName = new Text(
columnEntry.getKey().endsWith(":") ? columnEntry.getKey()
: columnEntry.getKey() + ":");
// get the table descriptor so we can get the old column descriptor
HTableDescriptor tDesc = getTableDescByName(admin, tableName);
HColumnDescriptor oldColumnDesc = tDesc.getFamily(columnName.getBytes());
// combine the options specified in the shell with the options
// from the existing descriptor to produce the new descriptor
columnDesc = getColumnDescriptor(columnName.toString(), columnEntry
.getValue(), oldColumnDesc);
// send the changes out to the master
admin.modifyColumn(new Text(tableName), columnName, columnDesc);
enableTable(admin, tableName);
break;
case NOOP:
return new ReturnMsg(0, "Invalid operation type.");
}
return new ReturnMsg(0, "Table altered successfully.");
} catch (Exception e) {
return new ReturnMsg(0, extractErrMsg(e));
}
}
private void disableTable(HBaseAdmin admin, String t) throws IOException {
println("Disabling " + t + "... Please wait.");
admin.disableTable(new Text(t));
}
private void enableTable(HBaseAdmin admin, String t) throws IOException {
println("Enabling " + t + "... Please wait.");
admin.enableTable(new Text(t));
}
/**
* Sets the table to be altered.
*
* @param t Table to be altered.
*/
public void setTable(String t) {
this.tableName = t;
}
/**
* Adds a column specification.
*
* @param columnSpec Column specification
*/
public void addColumnSpec(String c, Map<String, Object> columnSpec) {
columnSpecMap.put(c, columnSpec);
}
/**
* Sets the column to be dropped. Only applicable to the DROP operation.
*
* @param c Column to be dropped.
*/
public void setColumn(String c) {
this.column = c;
}
/**
* Sets the operation type of this alteration.
*
* @param operationType Operation type
* @see OperationType
*/
public void setOperationType(OperationType operationType) {
this.operationType = operationType;
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
private HTableDescriptor getTableDescByName(HBaseAdmin admin, String tn)
throws IOException {
HTableDescriptor[] tables = admin.listTables();
for (HTableDescriptor tDesc : tables) {
if (tDesc.getName().toString().equals(tn)) {
return tDesc;
}
}
return null;
}
/**
* Given a column name, column spec, and original descriptor, returns an
* instance of HColumnDescriptor representing the column spec, with empty
* values drawn from the original as defaults
*/
protected HColumnDescriptor getColumnDescriptor(String c,
Map<String, Object> columnSpec, HColumnDescriptor original)
throws IllegalArgumentException {
initOptions(original);
Set<String> specs = columnSpec.keySet();
for (String spec : specs) {
spec = spec.toUpperCase();
if (spec.equals("MAX_VERSIONS")) {
maxVersions = (Integer) columnSpec.get(spec);
} else if (spec.equals("MAX_LENGTH")) {
maxLength = (Integer) columnSpec.get(spec);
} else if (spec.equals("COMPRESSION")) {
compression = HColumnDescriptor.CompressionType.valueOf(((String) columnSpec
.get(spec)).toUpperCase());
} else if (spec.equals("IN_MEMORY")) {
inMemory = (Boolean) columnSpec.get(spec);
} else if (spec.equals("BLOCK_CACHE_ENABLED")) {
blockCacheEnabled = (Boolean) columnSpec.get(spec);
} else if (spec.equals("BLOOMFILTER")) {
bloomFilterType = BloomFilterType.valueOf(((String) columnSpec.get(spec))
.toUpperCase());
} else if (spec.equals("VECTOR_SIZE")) {
vectorSize = (Integer) columnSpec.get(spec);
} else if (spec.equals("NUM_HASH")) {
numHash = (Integer) columnSpec.get(spec);
} else if (spec.equals("NUM_ENTRIES")) {
numEntries = (Integer) columnSpec.get(spec);
} else if (spec.equals("TTL")) {
timeToLive = (Integer) columnSpec.get(spec);
} else {
throw new IllegalArgumentException("Invalid option: " + spec);
}
}
// Now we gather all the specified options for this column.
if (bloomFilterType != null) {
if (specs.contains("NUM_ENTRIES")) {
bloomFilterDesc = new BloomFilterDescriptor(bloomFilterType, numEntries);
} else {
bloomFilterDesc = new BloomFilterDescriptor(bloomFilterType, vectorSize,
numHash);
}
}
c = appendDelimiter(c);
HColumnDescriptor columnDesc =
new HColumnDescriptor(Bytes.toBytes(c),
maxVersions, compression, inMemory, blockCacheEnabled,
maxLength, timeToLive, bloomFilterDesc);
return columnDesc;
}
private void initOptions(HColumnDescriptor original) {
if (original == null) {
initOptions();
return;
}
maxVersions = original.getMaxVersions();
maxLength = original.getMaxValueLength();
compression = original.getCompression();
inMemory = original.isInMemory();
blockCacheEnabled = original.isBlockCacheEnabled();
bloomFilterDesc = original.getBloomFilter();
timeToLive = original.getTimeToLive();
}
}

View File

@@ -1,100 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
/**
* Takes the lowest-common-denominator {@link Writer} doing its own printlns,
* etc.
*
* @see <a
* href="http://wiki.apache.org/lucene-hadoop/Hbase/HbaseShell">HBaseShell</a>
*/
public abstract class BasicCommand implements Command, CommandFactory {
private final Writer out;
public final String LINE_SEPARATOR = System.getProperty("line.separator");
public final String TABLE_NOT_FOUND = " is a non-existent table.";
// Shutdown constructor.
@SuppressWarnings("unused")
private BasicCommand() {
this(null);
}
/**
* Constructor
*
* @param o A Writer.
*/
public BasicCommand(final Writer o) {
this.out = o;
}
public BasicCommand getBasicCommand() {
return this;
}
/** basic commands are their own factories. */
public Command getCommand() {
return this;
}
protected String extractErrMsg(String msg) {
int index = msg.indexOf(":");
int eofIndex = msg.indexOf("\n");
return msg.substring(index + 1, eofIndex);
}
protected String extractErrMsg(Exception e) {
return extractErrMsg(e.getMessage());
}
/**
* Appends, if it does not exist, a delimiter (colon) at the end of the column
* name.
*/
protected String appendDelimiter(String column) {
return (!column.endsWith(FAMILY_INDICATOR) && column
.indexOf(FAMILY_INDICATOR) == -1) ? column + FAMILY_INDICATOR : column;
}
/**
* @return Writer to use outputting.
*/
public Writer getOut() {
return this.out;
}
public void print(final String msg) throws IOException {
this.out.write(msg);
}
public void println(final String msg) throws IOException {
print(msg);
print(LINE_SEPARATOR);
this.out.flush();
}
public CommandType getCommandType() {
return CommandType.SELECT;
}
}

View File

@@ -1,62 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.HBaseConfiguration;
/**
* Clears the console screen.
*/
public class ClearCommand extends BasicCommand {
public ClearCommand(Writer o) {
super(o);
}
public ReturnMsg execute(@SuppressWarnings("unused")
HBaseConfiguration conf) {
clear();
return null;
}
private void clear() {
String osName = System.getProperty("os.name");
if (osName.length() > 7 && osName.subSequence(0, 7).equals("Windows")) {
try {
Runtime.getRuntime().exec("cmd /C cls");
} catch (IOException e) {
try {
println("Can't clear." + e.toString());
} catch (IOException e1) {
e1.printStackTrace();
}
}
} else {
System.out.print("\033c");
}
}
@Override
public CommandType getCommandType() {
return CommandType.SHELL;
}
}

View File

@@ -1,45 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import org.apache.hadoop.hbase.HBaseConfiguration;
public interface Command {
/** family indicator */
public static final String FAMILY_INDICATOR = ":";
public enum CommandType {
DDL, UPDATE, SELECT, INSERT, DELETE, SHELL
}
/**
* Execute a command
*
* @param conf Configuration
* @return Result of command execution
*/
public ReturnMsg execute(final HBaseConfiguration conf);
/**
* @return Type of this command whether DDL, SELECT, INSERT, UPDATE, DELETE,
* or SHELL.
*/
public CommandType getCommandType();
}

View File

@@ -1,27 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
/**
* Parser uses command factories to create command.
*/
public interface CommandFactory {
Command getCommand();
}


@ -1,30 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
/**
* Some constants used in the hql.
*/
public class Constants {
public static final int FLAG_RELAUNCH = 7;
public static final int FLAG_EXIT = 9999;
public static final int ERROR_CODE = -1;
}


@ -1,90 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.io.Text;
/**
* Creates tables.
*/
public class CreateCommand extends SchemaModificationCommand {
private Text tableName;
private Map<String, Map<String, Object>> columnSpecMap = new HashMap<String, Map<String, Object>>();
public CreateCommand(Writer o) {
super(o);
}
public ReturnMsg execute(HBaseConfiguration conf) {
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (admin.tableExists(tableName)) {
return new ReturnMsg(0, "'" + tableName + "' table already exists.");
}
HTableDescriptor tableDesc = new HTableDescriptor(tableName.getBytes());
HColumnDescriptor columnDesc = null;
Set<String> columns = columnSpecMap.keySet();
for (String column : columns) {
columnDesc = getColumnDescriptor(column, columnSpecMap.get(column));
tableDesc.addFamily(columnDesc);
}
println("Creating table... Please wait.");
admin.createTable(tableDesc);
return new ReturnMsg(0, "Table created successfully.");
} catch (Exception e) {
return new ReturnMsg(0, extractErrMsg(e));
}
}
/**
* Sets the table to be created.
*
* @param tableName Table to be created
*/
public void setTable(String tableName) {
this.tableName = new Text(tableName);
}
/**
* Adds a column specification.
*
* @param columnSpec Column specification
*/
public void addColumnSpec(String column, Map<String, Object> columnSpec) {
columnSpecMap.put(column, columnSpec);
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
}


@ -1,130 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.BatchUpdate;
/**
* Deletes values from tables.
*/
public class DeleteCommand extends BasicCommand {
public DeleteCommand(Writer o) {
super(o);
}
private Text tableName;
private Text rowKey;
private List<String> columnList;
public ReturnMsg execute(HBaseConfiguration conf) {
if (columnList == null) {
throw new IllegalArgumentException("Column list is null");
}
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName)) {
return new ReturnMsg(0, "'" + tableName + "'" + TABLE_NOT_FOUND);
}
HTable hTable = new HTable(conf, tableName);
if (rowKey != null) {
BatchUpdate bu = new BatchUpdate(rowKey.getBytes());
for (Text column : getColumnList(admin, hTable)) {
bu.delete(column.getBytes());
}
hTable.commit(bu);
} else {
admin.disableTable(tableName);
for (Text column : getColumnList(admin, hTable)) {
admin.deleteColumn(tableName, new Text(column));
}
admin.enableTable(tableName);
}
return new ReturnMsg(1, "Column(s) deleted successfully.");
} catch (IOException e) {
String[] msg = e.getMessage().split("[\n]");
return new ReturnMsg(0, msg[0]);
}
}
public void setTable(String tableName) {
this.tableName = new Text(tableName);
}
public void setRow(String row) {
this.rowKey = new Text(row);
}
/**
* Sets the column list.
*
* @param columnList
*/
public void setColumnList(List<String> columnList) {
this.columnList = columnList;
}
/**
* @param admin
* @param hTable
* @return return the column list.
*/
public Text[] getColumnList(HBaseAdmin admin, HTable hTable) {
Text[] columns = null;
try {
if (columnList.contains("*")) {
columns = hTable.getRow(new Text(this.rowKey)).keySet().toArray(
new Text[] {});
} else {
List<Text> tmpList = new ArrayList<Text>();
for (int i = 0; i < columnList.size(); i++) {
Text column = null;
if (columnList.get(i).contains(":"))
column = new Text(columnList.get(i));
else
column = new Text(columnList.get(i) + ":");
tmpList.add(column);
}
columns = tmpList.toArray(new Text[] {});
}
} catch (IOException e) {
e.printStackTrace();
}
return columns;
}
@Override
public CommandType getCommandType() {
return CommandType.DELETE;
}
}


@ -1,88 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.io.Text;
/**
* Prints information about tables.
*/
public class DescCommand extends BasicCommand {
private static final String[] HEADER = new String[] { "Column Family Descriptor" };
private Text tableName;
private final TableFormatter formatter;
// Not instantiable
@SuppressWarnings("unused")
private DescCommand() {
this(null, null);
}
public DescCommand(final Writer o, final TableFormatter f) {
super(o);
this.formatter = f;
}
public ReturnMsg execute(final HBaseConfiguration conf) {
if (tableName == null)
return new ReturnMsg(0, "Syntax error : Please check 'Describe' syntax.");
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName)) {
return new ReturnMsg(0, "Table not found.");
}
HTableDescriptor[] tables = admin.listTables();
HColumnDescriptor[] columns = null;
for (int i = 0; i < tables.length; i++) {
if (tables[i].getName().equals(tableName)) {
columns = tables[i].getFamilies().toArray(new HColumnDescriptor[] {});
break;
}
}
formatter.header(HEADER);
// Do a toString on the HColumnDescriptors
String[] columnStrs = new String[columns.length];
for (int i = 0; i < columns.length; i++) {
String tmp = columns[i].toString();
// Strip the curly-brackets if present.
if (tmp.length() > 2 && tmp.startsWith("{") && tmp.endsWith("}")) {
tmp = tmp.substring(1, tmp.length() - 1);
}
columnStrs[i] = tmp;
formatter.row(new String[] { columnStrs[i] });
}
formatter.footer();
return new ReturnMsg(1, columns.length + " columnfamily(s) in set.");
} catch (IOException e) {
return new ReturnMsg(0, "error msg : " + e.toString());
}
}
public void setArgument(String table) {
this.tableName = new Text(table);
}
}


@ -1,66 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.Text;
/**
* Disables tables.
*/
public class DisableCommand extends BasicCommand {
private String tableName;
public DisableCommand(Writer o) {
super(o);
}
public ReturnMsg execute(HBaseConfiguration conf) {
assert tableName != null;
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(new Text(tableName))) {
return new ReturnMsg(0, "'" + tableName + "'" + TABLE_NOT_FOUND);
}
admin.disableTable(new Text(tableName));
return new ReturnMsg(1, "Table disabled successfully.");
} catch (IOException e) {
String[] msg = e.getMessage().split("[\n]");
return new ReturnMsg(0, msg[0]);
}
}
public void setTable(String table) {
this.tableName = table;
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
}


@ -1,77 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import java.util.List;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.Text;
/**
* Drops tables.
*/
public class DropCommand extends BasicCommand {
private List<String> tableList;
public DropCommand(Writer o) {
super(o);
}
public ReturnMsg execute(HBaseConfiguration conf) {
if (tableList == null) {
throw new IllegalArgumentException("List of tables is null.");
}
try {
HBaseAdmin admin = new HBaseAdmin(conf);
int count = 0;
for (String table : tableList) {
if (!admin.tableExists(new Text(table))) {
println("'" + table + "' table not found.");
} else {
println("Dropping " + table + "... Please wait.");
admin.deleteTable(new Text(table));
count++;
}
}
if (count > 0) {
return new ReturnMsg(1, count + " table(s) dropped successfully.");
} else {
return new ReturnMsg(0, count + " table(s) dropped.");
}
} catch (IOException e) {
return new ReturnMsg(0, extractErrMsg(e));
}
}
public void setTableList(List<String> tableList) {
this.tableList = tableList;
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
}


@ -1,62 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.io.Text;
/**
* Enables tables.
*/
public class EnableCommand extends BasicCommand {
private String tableName;
public EnableCommand(Writer o) {
super(o);
}
public ReturnMsg execute(HBaseConfiguration conf) {
assert tableName != null;
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(new Text(tableName))) {
return new ReturnMsg(0, "'" + tableName + "'" + TABLE_NOT_FOUND);
}
admin.enableTable(new Text(tableName));
return new ReturnMsg(1, "Table enabled successfully.");
} catch (IOException e) {
String[] msg = e.getMessage().split("[\n]");
return new ReturnMsg(0, msg[0]);
}
}
public void setTable(String table) {
this.tableName = table;
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
}


@ -1,44 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.Shell;
public class ExitCommand extends BasicCommand {
public ExitCommand(Writer o) {
super(o);
}
public ReturnMsg execute(@SuppressWarnings("unused")
HBaseConfiguration conf) {
// TODO: Is this the best way to exit? Would be a problem if shell is run
// inside another program -- St.Ack 09/11/2007
System.exit(Constants.FLAG_EXIT);
return null;
}
@Override
public CommandType getCommandType() {
return CommandType.SHELL;
}
}


@ -1,44 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
public class ExitException extends SecurityException {
private static final long serialVersionUID = -8085525076856622991L;
/** Status code */
private int status;
/**
* Constructs an exit exception.
*
* @param status the status code returned via System.exit()
*/
public ExitException(int status) {
this.status = status;
}
/**
* The status code returned by System.exit()
*
* @return the status code returned by System.exit()
*/
public int getStatus() {
return status;
}
}
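`ExitException` carries an exit status as an unchecked exception, so a host program (for instance a test harness embedding the shell) could catch it instead of letting `System.exit()` kill the JVM. A self-contained sketch of that flow, with the class copied inline (any `SecurityManager` wiring the original shell may have used to intercept `System.exit` is omitted here):

```java
// Sketch: an ExitException-style SecurityException carries an exit status
// up through nested calls instead of terminating the JVM directly.
public class ExitExceptionDemo {
    static class ExitException extends SecurityException {
        private final int status;
        ExitException(int status) { this.status = status; }
        int getStatus() { return status; }
    }

    static void runCommand() {
        // Deep inside command execution, an exit is requested;
        // 9999 mirrors FLAG_EXIT in the original Constants class.
        throw new ExitException(9999);
    }

    public static void main(String[] args) {
        int status;
        try {
            runCommand();
            status = 0;
        } catch (ExitException e) {
            status = e.getStatus();  // the embedding shell decides what to do
        }
        System.out.println("exit status: " + status);
    }
}
```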


@ -1,64 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import java.util.List;
import org.apache.hadoop.fs.FsShell;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.util.ToolRunner;
/**
* Run hadoop filesystem commands.
*/
public class FsCommand extends BasicCommand {
private List<String> query;
public FsCommand(Writer o) {
super(o);
}
public ReturnMsg execute(@SuppressWarnings("unused")
HBaseConfiguration conf) {
// This command runs the hadoop FsShell, which writes directly to stdout
// rather than through the shell's Writer.
FsShell shell = new FsShell();
try {
ToolRunner.run(shell, getQuery());
shell.close();
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
public void setQuery(List<String> query) {
this.query = query;
}
private String[] getQuery() {
return query.toArray(new String[] {});
}
@Override
public CommandType getCommandType() {
return CommandType.SHELL;
}
}


@ -1,80 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.hql.generated.HQLParser;
import org.apache.hadoop.hbase.hql.generated.ParseException;
import org.apache.hadoop.hbase.hql.generated.TokenMgrError;
/**
* HQL query language service client interfaces.
*/
public class HQLClient {
static HBaseConfiguration conf;
static TableFormatter tableFormatter = null;
static Writer out = null;
/**
* Constructor
*
* @param config HBaseConfiguration
* @param ip IP Address
* @param port port number
* @param writer writer
* @param formatter table formatter
*/
public HQLClient(HBaseConfiguration config, String ip, int port, Writer writer,
TableFormatter formatter) {
conf = config;
if (ip != null && port != -1)
conf.set("hbase.master", ip + ":" + port);
out = writer;
tableFormatter = formatter;
}
/**
* Executes query.
*
* @param query
* @return ReturnMsg object
*/
public ReturnMsg executeQuery(String query) {
HQLParser parser = new HQLParser(query, out, tableFormatter);
ReturnMsg msg = null;
try {
Command cmd = parser.terminatedCommand();
if (cmd != null) {
msg = cmd.execute(conf);
}
} catch (ParseException pe) {
msg = new ReturnMsg(Constants.ERROR_CODE,
"Syntax error : Type 'help;' for usage.");
} catch (TokenMgrError te) {
msg = new ReturnMsg(Constants.ERROR_CODE,
"Lexical error : Type 'help;' for usage.");
}
return msg;
}
}


@ -1,836 +0,0 @@
options {
STATIC = false;
IGNORE_CASE = true;
}
PARSER_BEGIN(HQLParser)
package org.apache.hadoop.hbase.hql.generated;
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.HashMap;
import java.io.StringReader;
import java.io.Reader;
import java.io.Writer;
import java.net.URLEncoder;
import java.io.UnsupportedEncodingException;
import org.apache.hadoop.hbase.hql.*;
/**
* Parsing command line.
*/
public class HQLParser {
private String QueryString;
private TableFormatter formatter;
private Writer out;
public HQLParser(final String query, final Writer o, final TableFormatter f) {
this((Reader)(new StringReader(query)));
this.QueryString = query;
this.formatter = f;
this.out = o;
}
public String getQueryStr() {
return this.QueryString;
}
}
PARSER_END(HQLParser)
SKIP :
{
" "
| "\t"
| "\r"
| "\n"
}
TOKEN: /** for HQL statements */
{
<HELP: "help">
| <ALTER: "alter">
| <CLEAR: "clear">
| <SHOW: "show">
| <DESCRIBE: "describe">
| <DESC: "desc">
| <CREATE: "create">
| <DROP: "drop">
| <TRUNCATE: "truncate">
| <FS: "fs">
| <JAR: "jar">
| <EXIT: "exit">
| <INSERT: "insert">
| <INTO: "into">
| <TABLE: "table">
| <DELETE: "delete">
| <SELECT: "select">
| <ENABLE: "enable">
| <DISABLE: "disable">
| <STARTING: "starting">
| <WHERE: "where">
| <FROM: "from">
| <UNTIL: "until">
| <ROW: "row">
| <VALUES: "values">
| <COLUMNFAMILIES: "columnfamilies">
| <TIMESTAMP: "timestamp">
| <NUM_VERSIONS: "num_versions">
| <LIMIT: "limit">
| <AND: "and">
| <OR: "or">
| <COMMA: ",">
| <LPAREN: "(">
| <RPAREN: ")">
| <EQUALS: "=">
| <LCOMP: ">">
| <RCOMP: "<">
| <NOT: "not">
| <IN: "in">
| <NOTEQUAL: "!=">
| <ASTERISK: "*">
| <MAX_VERSIONS: "max_versions">
| <MAX_LENGTH: "max_length">
| <COMPRESSION: "compression">
| <NONE: "none">
| <BLOCK: "block">
| <RECORD: "record">
| <IN_MEMORY: "in_memory">
| <BLOCK_CACHE_ENABLED: "block_cache_enabled">
| <TTL: "ttl">
| <BLOOMFILTER: "bloomfilter">
| <COUNTING_BLOOMFILTER: "counting_bloomfilter">
| <RETOUCHED_BLOOMFILTER: "retouched_bloomfilter">
| <VECTOR_SIZE: "vector_size">
| <NUM_HASH: "num_hash">
| <NUM_ENTRIES: "num_entries">
| <ADD: "add">
| <CHANGE: "change">
}
TOKEN : /** Functions */
{
<COUNT: "count">
}
TOKEN : /** Literals */
{
<ID: ["A"-"Z","a"-"z","_","-",":","/","~","."] (["A"-"Z","a"-"z","0"-"9","_","-",":","/","~","."])* >
| <INTEGER_LITERAL: (["0"-"9"])+ >
| <FLOATING_POINT_LITERAL:
(["0"-"9"])+ "." (["0"-"9"])+ (<EXPONENT>)?
| "." (["0"-"9"])+ (<EXPONENT>)?
| (["0"-"9"])+ <EXPONENT>
| (["0"-"9"])+ (<EXPONENT>)?
>
| <#EXPONENT: ["e","E"] (["+","-"])? (["0"-"9"])+ >
| <QUOTED_IDENTIFIER: "\"" (~["\""])+ "\"" >
| <STRING_LITERAL: "'" (~["'"])* ( "''" (~["'"])* )* "'" >
}
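Note the `STRING_LITERAL` token above escapes an embedded single quote by doubling it, SQL-style (`'it''s'`). A sketch of the same token shape as a Java regular expression, useful for checking which strings the grammar would accept:

```java
import java.util.regex.Pattern;

// Sketch: the HQL STRING_LITERAL token, "'" (~["'"])* ( "''" (~["'"])* )* "'",
// expressed as an equivalent Java regex.
public class StringLiteralDemo {
    static final Pattern STRING_LITERAL =
        Pattern.compile("'[^']*(?:''[^']*)*'");

    public static void main(String[] args) {
        System.out.println(STRING_LITERAL.matcher("'hello'").matches());  // plain literal
        System.out.println(STRING_LITERAL.matcher("'it''s'").matches());  // doubled quote
        System.out.println(STRING_LITERAL.matcher("'a'b'").matches());    // unescaped quote: rejected
    }
}
```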
/**
 * Parses a single semicolon-terminated command.
 */
Command terminatedCommand() :
{
Command statement = null;
}
{
(
[statement = cmdStatement()] ";" | <EOF>
)
{
return statement;
}
}
Command cmdStatement() :
{
Command cmd = null;
}
{
(
cmd = exitCommand()
| cmd = helpCommand()
| cmd = showCommand()
| cmd = descCommand()
| cmd = createCommand()
| cmd = dropCommand()
| cmd = truncateCommand()
| cmd = alterCommand()
| cmd = insertCommand()
| cmd = deleteCommand()
| cmd = selectCommand()
| cmd = enableCommand()
| cmd = disableCommand()
| cmd = clearCommand()
| cmd = fsCommand()
| cmd = jarCommand()
)
{
return cmd;
}
}
ExitCommand exitCommand() :
{
ExitCommand exit = new ExitCommand(this.out);
}
{
<EXIT> { return exit; }
}
FsCommand fsCommand() :
{
Token t = null;
FsCommand fs = new FsCommand(this.out);
List<String> query = new ArrayList<String>();
}
{
<FS>
(
t = <ID>
{ query.add(t.image.toString()); }
)*
{
fs.setQuery(query);
return fs;
}
}
JarCommand jarCommand() :
{
Token t = null;
JarCommand jar = new JarCommand(this.out);
List<String> query = new ArrayList<String>();
}
{
<JAR>
(
( t=<ID> | t=<INTEGER_LITERAL> | t=<FLOATING_POINT_LITERAL> )
{ query.add(t.image.toString()); }
)*
{
jar.setQuery(query);
return jar;
}
}
TruncateCommand truncateCommand() :
{
TruncateCommand truncate = new TruncateCommand(this.out);
String tableName = null;
}
{
<TRUNCATE><TABLE>
[
tableName = identifier()
]
{
truncate.setTableName(tableName);
return truncate;
}
}
HelpCommand helpCommand() :
{
Token t = null;
HelpCommand help = new HelpCommand(this.out, this.formatter);
String argument = "";
}
{
<HELP>
[
(
t=<SHOW>
| t=<DESCRIBE>
| t=<CREATE>
| t=<DROP>
| t=<EXIT>
| t=<INSERT>
| t=<DELETE>
| t=<SELECT>
| t=<ALTER>
| t=<CLEAR>
| t=<FS>
| t=<JAR>
| t=<ID>
) { argument = t.image.toString(); }
]
{
help.setArgument(argument);
return help;
}
}
ShowCommand showCommand() :
{
ShowCommand show = new ShowCommand(this.out, this.formatter);
String argument = null;
}
{
<SHOW>
[
argument = identifier()
]
{
show.setArgument(argument);
return show;
}
}
DescCommand descCommand() :
{
DescCommand desc = new DescCommand(this.out, this.formatter);
String argument = null;
}
{
( <DESCRIBE> | <DESC> )
argument = identifier()
{
desc.setArgument(argument);
return desc;
}
}
Map<String, Object> ColumnSpec() :
{
Map<String, Object> columnSpec = new HashMap<String, Object>();
int n = -1;
Token t = null;
}
{
(
<MAX_VERSIONS>
<EQUALS> n = number()
{
if(n < 0) {
n = Integer.MAX_VALUE;
}
columnSpec.put("MAX_VERSIONS", n);
}
|
<MAX_LENGTH>
<EQUALS> n = number()
{
columnSpec.put("MAX_LENGTH", n);
}
|
<COMPRESSION>
<EQUALS>
( t=<NONE>
| t=<BLOCK>
| t=<RECORD> )
{
columnSpec.put("COMPRESSION", t.image.toString());
}
|
<IN_MEMORY>
{
columnSpec.put("IN_MEMORY", true);
}
|
<BLOCK_CACHE_ENABLED>
{
columnSpec.put("BLOCK_CACHE_ENABLED", true);
}
|
<TTL>
<EQUALS> n = number()
{
columnSpec.put("TTL", n);
}
|
<BLOOMFILTER>
<EQUALS>
( t=<BLOOMFILTER>
| t=<COUNTING_BLOOMFILTER>
| t=<RETOUCHED_BLOOMFILTER>
)
{
columnSpec.put("BLOOMFILTER", t.image.toString());
}
|
<VECTOR_SIZE>
<EQUALS> n = number()
{
columnSpec.put("VECTOR_SIZE", n);
}
|
<NUM_HASH>
<EQUALS> n = number()
{
columnSpec.put("NUM_HASH", n);
}
|
<NUM_ENTRIES> <EQUALS> n = number()
{
columnSpec.put("NUM_ENTRIES", n);
}
)*
{ return columnSpec; }
}
CreateCommand createCommand() :
{
CreateCommand createCommand = new CreateCommand(this.out);
String table = null;
Map<String, Object> columnSpec = null;
String column = null;
}
{
<CREATE>
<TABLE>
table = identifier()
{
createCommand.setTable(table);
}
<LPAREN>
column = identifier()
columnSpec = ColumnSpec()
{
createCommand.addColumnSpec(column, columnSpec);
}
(
<COMMA>
column = identifier()
columnSpec = ColumnSpec()
{
createCommand.addColumnSpec(column, columnSpec);
}
)*
<RPAREN>
{ return createCommand; }
}
AlterCommand alterCommand() :
{
AlterCommand alterCommand = new AlterCommand(this.out);
String table = null;
String column = null;
Map<String, Object> columnSpec = null;
}
{
<ALTER>
<TABLE> table = identifier()
{ alterCommand.setTable(table); }
(
LOOKAHEAD(2)
<ADD> column = identifier() columnSpec = ColumnSpec()
{
alterCommand.setOperationType(AlterCommand.OperationType.ADD);
alterCommand.addColumnSpec(column, columnSpec);
}
|
<ADD>
<LPAREN>
{
alterCommand.setOperationType(AlterCommand.OperationType.ADD);
}
column = identifier() columnSpec = ColumnSpec()
{
alterCommand.addColumnSpec(column, columnSpec);
}
(
<COMMA>
column = identifier()
columnSpec = ColumnSpec()
{
alterCommand.addColumnSpec(column, columnSpec);
}
)*
<RPAREN>
|
<DROP> column = identifier()
{
alterCommand.setOperationType(AlterCommand.OperationType.DROP);
alterCommand.setColumn(column);
}
|
<CHANGE> column = identifier() columnSpec = ColumnSpec()
{
alterCommand.setOperationType(AlterCommand.OperationType.CHANGE);
alterCommand.addColumnSpec(column, columnSpec);
}
)
{ return alterCommand; }
}
DropCommand dropCommand() :
{
DropCommand drop = new DropCommand(this.out);
List<String> tableList = null;
}
{
<DROP>
<TABLE>
tableList = tableList()
{
drop.setTableList(tableList);
return drop;
}
}
InsertCommand insertCommand() :
{
InsertCommand in = new InsertCommand(this.out);
List<String> columnfamilies = null;
List<String> values = null;
String table = null;
String timestamp = null;
Token t = null;
}
{
<INSERT>
<INTO>
table = identifier()
{
in.setTable(table);
}
columnfamilies = getColumns()
{
in.setColumnfamilies(columnfamilies);
}
<VALUES> values = getLiteralValues()
{
in.setValues(values);
}
<WHERE>
<ROW> <EQUALS> ( t=<STRING_LITERAL> | t=<QUOTED_IDENTIFIER> )
{
in.setRow(t.image.substring(1, t.image.length()-1));
}
[ <TIMESTAMP>
timestamp = getStringLiteral()
{
in.setTimestamp(timestamp);
}
]
{
return in;
}
}
DeleteCommand deleteCommand() :
{
DeleteCommand deleteCommand = new DeleteCommand(this.out);
List<String> columnList = null;
Token t = null;
String table = null;
}
{
<DELETE>
columnList = columnList()
{
deleteCommand.setColumnList(columnList);
}
<FROM>
table = identifier()
{
deleteCommand.setTable(table);
}
[
<WHERE>
<ROW> <EQUALS> ( t=<STRING_LITERAL> | t=<QUOTED_IDENTIFIER> )
{
deleteCommand.setRow(t.image.substring(1, t.image.length()-1));
}
]
{ return deleteCommand; }
}
SelectCommand selectCommand() :
{
SelectCommand select = new SelectCommand(this.out, this.formatter);
List<String> columns = null;
String rowKey = "";
String stopRow = "";
String timestamp = null;
int numVersion = 0;
String tableName = null;
int limit;
}
{
<SELECT>
(
<COUNT> columns = getLiteralValues()
{ select.setCountFunction(true); }
| columns = columnList()
)
<FROM>
tableName = identifier()
{
select.setColumns(columns);
select.setTable(tableName);
}
[ ( <WHERE> <ROW> <EQUALS>
{ select.setWhere(true); }
| <STARTING> <FROM> )
rowKey = getStringLiteral()
{
select.setRowKey(rowKey);
}
[<UNTIL>
stopRow = getStringLiteral()
{select.setStopRow(stopRow);} ]
]
[ <TIMESTAMP>
timestamp = getStringLiteral()
{
select.setTimestamp(timestamp);
}
]
[
<NUM_VERSIONS><EQUALS> numVersion = number()
{
select.setVersion(numVersion);
}
]
[ <LIMIT><EQUALS> limit = number() {
try{
select.setLimit(limit);
}catch(ClassCastException ce) {
throw generateParseException();
}
} ]
{ return select; }
}
EnableCommand enableCommand() :
{
EnableCommand enableCommand = new EnableCommand(this.out);
String table = null;
}
{
<ENABLE>
table = identifier()
{
enableCommand.setTable(table);
return enableCommand;
}
}
DisableCommand disableCommand() :
{
DisableCommand disableCommand = new DisableCommand(this.out);
String table = null;
}
{
<DISABLE>
table = identifier()
{
disableCommand.setTable(table);
return disableCommand;
}
}
ClearCommand clearCommand() :
{
ClearCommand clear = new ClearCommand(this.out);
}
{
<CLEAR>
{
return clear;
}
}
List<String> getLiteralValues() :
{
List<String> values = new ArrayList<String>();
String literal = null;
}
{
<LPAREN>
{
literal = getStringLiteral();
if(literal != null) values.add(literal);
}
(
<COMMA> {
literal = getStringLiteral();
if(literal != null) values.add(literal);
}
)*
<RPAREN>
{
return values;
}
}
String getStringLiteral() :
{
Token s;
String value = null;
}
{
(
( s=<STRING_LITERAL> | s=<QUOTED_IDENTIFIER> )
{
value = s.image.toString();
return value.substring(1,value.length() - 1);
}
| ( s=<ID> | s=<INTEGER_LITERAL> | s=<ASTERISK> )
{
value = s.image.toString();
return value;
}
)
}
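Several productions above (and `identifier()`, `getColumn()` below) strip the surrounding quotes from `STRING_LITERAL` and `QUOTED_IDENTIFIER` tokens with `substring(1, image.length() - 1)`. Pulled out as a standalone helper (hypothetical name, not part of the grammar), the unquoting is just:

```java
class Tokens {
  // Strip one leading and one trailing quote character, as the
  // STRING_LITERAL / QUOTED_IDENTIFIER actions in the grammar do.
  // Assumes the lexer guarantees the token image really is quoted
  // on both ends; no escape handling is attempted.
  static String unquote(String image) {
    return image.substring(1, image.length() - 1);
  }

  public static void main(String[] args) {
    System.out.println(unquote("'abc'"));   // abc
    System.out.println(unquote("\"x\""));   // x
  }
}
```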
String getColumn() :
{
Token col;
}
{
(
( col=<ID> | col=<INTEGER_LITERAL> | col=<ASTERISK> )
{ return col.image.toString(); }
| (col=<QUOTED_IDENTIFIER> | col=<STRING_LITERAL> )
{ return col.image.substring(1,col.image.toString().length() - 1); }
)
}
List<String> getColumns() : // return parenthesized column list
{
List<String> values = new ArrayList<String>();
String literal = null;
}
{
<LPAREN>
{ literal = getColumn();
if(literal != null) values.add(literal);
}
(
<COMMA>
{
literal = getColumn();
if(literal != null) values.add(literal);
}
)*
<RPAREN>
{
return values;
}
}
List<String> tableList() :
{
List<String> tableList = new ArrayList<String>();
String table = null;
}
{
table = identifier() { tableList.add(table); }
( <COMMA> table = identifier()
{ tableList.add(table); }
)*
{ return tableList; }
}
List<String> columnList() :
{
List<String> columnList = new ArrayList<String>();
String column = null;
}
{
column = getColumn()
{
if(column != null) {
columnList.add(column);
} else {
return columnList;
}
}
( <COMMA> column = getColumn()
{ columnList.add(column); }
)*
{ return columnList; }
}
int number() :
{
Token t = null;
Token minusSignedInt = null;
}
{
( minusSignedInt=<ID> | t=<INTEGER_LITERAL> )
{
if(minusSignedInt != null) {
return Integer.parseInt(minusSignedInt.image.toString());
} else {
return Integer.parseInt(t.image.toString());
}
}
}
String identifier() :
{
Token t = null;
}
{
(
t=<ID>
{ return t.image.toString(); }
| ( t=<QUOTED_IDENTIFIER> | t=<STRING_LITERAL> )
{ return t.image.substring(1,t.image.toString().length() - 1); }
)
}
String appendIndicator(String columnName) :
{
String column = columnName;
}
{
{
return (!column.endsWith(":") && column.indexOf(":") == -1)
? column + ":" : column;
}
}
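The `appendIndicator` production above only guarantees a trailing column-family delimiter; note that the `endsWith(":")` test is redundant, since a name ending in `:` already contains one. As a plain Java helper mirroring the grammar action (illustrative class name):

```java
class ColumnNames {
  // Mirrors the appendIndicator production: a column name with no
  // family delimiter gets one appended; otherwise it is returned
  // unchanged. The grammar's endsWith(":") check is subsumed by the
  // indexOf test, so it is dropped here.
  static String appendIndicator(String column) {
    return column.indexOf(':') == -1 ? column + ":" : column;
  }

  public static void main(String[] args) {
    System.out.println(appendIndicator("info"));    // info:
    System.out.println(appendIndicator("info:a"));  // info:a
  }
}
```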


@@ -1,75 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.security.Permission;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.Shell;
/**
* This is intended as a replacement for the default security manager. The
* goal is to intercept System.exit calls and throw an exception instead, so
* that a System.exit in a jar command program does not terminate the Shell.
*
* @see ExitException
*/
public class HQLSecurityManager extends SecurityManager {
/**
* Override SecurityManager#checkExit. This throws an ExitException(status)
* exception.
*
* @param status the exit status
*/
@SuppressWarnings("static-access")
public void checkExit(int status) {
if (status != Constants.FLAG_EXIT) {
// throw new ExitException(status);
// I couldn't figure out how to catch the ExitException in the shell
// main, so I just relaunch the shell instead.
Shell shell = new Shell();
List<String> argList = new ArrayList<String>();
argList.add(String.valueOf(Constants.FLAG_RELAUNCH));
if(Shell.HTML_OPTION != null)
argList.add(Shell.HTML_OPTION);
if(Shell.IP != null && Shell.PORT != -1)
argList.add("--master:" + Shell.IP + ":" + Shell.PORT);
try {
shell.main(argList.toArray(new String[] {}));
} catch (IOException e) {
e.printStackTrace();
}
}
}
/**
* Override SecurityManager#checkPermission. This does nothing.
*
* @param perm the requested permission.
*/
public void checkPermission(Permission perm) {
}
}
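The removed HQLSecurityManager hinges on one JVM hook: `SecurityManager#checkExit` runs before the VM honors `System.exit`, so overriding it lets the shell intervene when user code run through the JAR command calls exit. The javadoc describes the throw-and-catch variant of the trick; a minimal sketch of just that variant, with illustrative names (not the removed class, which relaunched the shell instead), exercising the hook directly rather than installing it via `System.setSecurityManager` (which is deprecated on modern JDKs):

```java
class NoExitDemo {
  // Thrown instead of letting System.exit terminate the JVM.
  static class ExitTrappedException extends SecurityException {
    final int status;
    ExitTrappedException(int status) {
      super("exit(" + status + ") trapped");
      this.status = status;
    }
  }

  // checkExit is consulted before the VM shuts down; throwing here
  // aborts the exit. checkPermission is a no-op so everything else
  // remains allowed, as in the removed class.
  static class NoExitSecurityManager extends SecurityManager {
    @Override public void checkExit(int status) {
      throw new ExitTrappedException(status);
    }
    @Override public void checkPermission(java.security.Permission perm) {
    }
  }

  public static void main(String[] args) {
    NoExitSecurityManager sm = new NoExitSecurityManager();
    try {
      sm.checkExit(1);
    } catch (ExitTrappedException e) {
      System.out.println("trapped status " + e.status);  // trapped status 1
    }
  }
}
```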


@@ -1,181 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.UnsupportedEncodingException;
import java.io.Writer;
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.util.VersionInfo;
public class HelpCommand extends BasicCommand {
private String argument;
private static final String[] HEADER = new String[] { "Command",
"Description", "Example" };
/** application name */
public static final String APP_NAME = "HQL";
/** help contents map */
public final Map<String, String[]> help = new HashMap<String, String[]>();
private final TableFormatter formatter;
public HelpCommand(final Writer o, final TableFormatter f) {
super(o);
this.help.putAll(load());
this.formatter = f;
}
public ReturnMsg execute(@SuppressWarnings("unused")
HBaseConfiguration conf) {
try {
printHelp(this.argument);
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
public void setArgument(String argument) {
this.argument = argument;
}
/**
* add help contents
*/
private Map<? extends String, ? extends String[]> load() {
Map<String, String[]> load = new HashMap<String, String[]>();
load.put("SHOW", new String[] { "Show information about selected title",
"SHOW TABLES [or substitution variable name];" });
load.put("FS", new String[] {
"Hadoop FsShell; entering a lone 'FS;' " + "will emit usage",
"FS [-option] arguments..;" });
load.put("JAR", new String[] { "Hadoop RunJar util",
"JAR jarFile [mainClass] arguments...;" });
load.put("CLEAR", new String[] { "Clear the screen", "CLEAR;" });
load.put("DESCRIBE", new String[] { "Print table information",
"[DESCRIBE|DESC] table_name;" });
load
.put(
"CREATE",
new String[] {
"Create tables",
"CREATE TABLE table_name (column_family_name [MAX_VERSIONS=n] "
+ "[MAX_LENGTH=n] [COMPRESSION=NONE|RECORD|BLOCK] [IN_MEMORY] [TTL=n] "
+ "[BLOOMFILTER=NONE|BLOOMFILTER|COUNTING_BLOOMFILTER|RETOUCHED_BLOOMFILTER "
+ "VECTOR_SIZE=n NUM_HASH=n], " + "...)" });
load.put("DROP", new String[] { "Drop tables",
"DROP TABLE table_name [, table_name] ...;" });
load.put("INSERT", new String[] {
"Insert values into table",
"INSERT INTO table_name (column_name, ...) "
+ "VALUES ('value', ...) WHERE row='row_key'" +
" [TIMESTAMP 'timestamp'];" });
load.put("DELETE", new String[] {
"Delete table data",
"DELETE {column_name, [, column_name] ... | *} FROM table_name "
+ "WHERE row='row-key';" });
load.put("SELECT", new String[] {
"Select values from table",
"SELECT {column_name, [, column_name] ... | expr[alias] | * } FROM table_name "
+ "[WHERE row='row_key' | STARTING FROM 'row-key' [UNTIL 'stop-key']] "
+ "[NUM_VERSIONS = version_count] " + "[TIMESTAMP 'timestamp'] "
+ "[LIMIT = row_count] " + "[INTO FILE 'file_name'];" });
load.put("ALTER", new String[] {
"Alter structure of table",
"ALTER TABLE table_name ADD column_spec | "
+ "ADD (column_spec, column_spec, ...) | "
+ "CHANGE column_family column_spec | "
+ "DROP column_family_name | " + "CHANGE column_spec;" });
load.put("TRUNCATE", new String[] {
"Truncate table is used to clean all data from a table",
"TRUNCATE TABLE table_name;" });
load.put("EXIT", new String[] { "Exit shell", "EXIT;" });
return load;
}
/**
* Print out the program version.
*
* @throws IOException
*/
public void printVersion() throws IOException {
println(APP_NAME + ", " + VersionInfo.getVersion() + " version.\n"
+ "Copyright (c) 2008 by udanax, "
+ "licensed to Apache Software Foundation.\n"
+ "Type 'help;' for usage.\n");
}
public void printHelp(final String cmd) throws IOException {
if (cmd.equals("")) {
println("Type 'help COMMAND;' to see command-specific usage.");
printHelp(this.help);
} else {
if (this.help.containsKey(cmd.toUpperCase())) {
final Map<String, String[]> m = new HashMap<String, String[]>();
m.put(cmd.toUpperCase(), this.help.get(cmd.toUpperCase()));
printHelp(m);
} else {
println("Unknown Command : Type 'help;' for usage.");
}
}
}
private void printHelp(final Map<String, String[]> m) throws IOException {
this.formatter.header(HEADER);
for (Map.Entry<String, String[]> e : m.entrySet()) {
String[] value = e.getValue();
if (value.length == 2) {
this.formatter.row(new String[] { e.getKey().toUpperCase(), value[0],
value[1] });
} else {
throw new IOException("Value has wrong number of elements: "
+ java.util.Arrays.toString(value));
}
}
this.formatter.footer();
}
public static void main(String[] args) throws UnsupportedEncodingException {
HBaseConfiguration conf = new HBaseConfiguration();
Writer out = new OutputStreamWriter(System.out, "UTF-8");
TableFormatterFactory tff = new TableFormatterFactory(out, conf);
HelpCommand cmd = new HelpCommand(out, tff.get());
cmd.setArgument("");
cmd.execute(conf);
cmd.setArgument("select");
cmd.execute(conf);
}
}


@@ -1,124 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import java.util.List;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.io.BatchUpdate;
/**
* Inserts values into tables.
*/
public class InsertCommand extends BasicCommand {
private Text tableName;
private List<String> columnfamilies;
private List<String> values;
private String rowKey;
private String timestamp = null;
public InsertCommand(Writer o) {
super(o);
}
public ReturnMsg execute(HBaseConfiguration conf) {
if (tableName == null || values == null || rowKey == null)
return new ReturnMsg(0, "Syntax error : Please check 'Insert' syntax.");
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName)) {
return new ReturnMsg(0, "'" + tableName + "'" + TABLE_NOT_FOUND);
}
if (columnfamilies.size() != values.size())
return new ReturnMsg(0,
"Mismatch between values list and columnfamilies list.");
try {
HTable table = new HTable(conf, tableName);
BatchUpdate batchUpdate = timestamp == null ?
new BatchUpdate(getRow().getBytes())
: new BatchUpdate(getRow().getBytes(), Long.parseLong(timestamp));
for (int i = 0; i < values.size(); i++) {
Text column = null;
if (getColumn(i).toString().contains(":"))
column = getColumn(i);
else
column = new Text(getColumn(i) + ":");
batchUpdate.put(column.getBytes(), getValue(i));
}
table.commit(batchUpdate);
return new ReturnMsg(1, "1 row inserted successfully.");
} catch (IOException e) {
String[] msg = e.getMessage().split("[\n]");
return new ReturnMsg(0, msg[0]);
}
} catch (MasterNotRunningException e) {
return new ReturnMsg(0, "Master is not running!");
}
}
public void setTable(String table) {
this.tableName = new Text(table);
}
public void setColumnfamilies(List<String> columnfamilies) {
this.columnfamilies = columnfamilies;
}
public void setValues(List<String> values) {
this.values = values;
}
public void setRow(String row) {
this.rowKey = row;
}
public Text getRow() {
return new Text(this.rowKey);
}
public Text getColumn(int i) {
return new Text(this.columnfamilies.get(i));
}
public byte[] getValue(int i) {
return this.values.get(i).getBytes();
}
public void setTimestamp(String timestamp) {
this.timestamp = timestamp;
}
@Override
public CommandType getCommandType() {
return CommandType.INSERT;
}
}


@@ -1,156 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.File;
import java.io.IOException;
import java.io.Writer;
import java.lang.reflect.Array;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.jar.JarFile;
import java.util.jar.Manifest;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.util.RunJar;
/**
* Run hadoop jar commands.
*/
public class JarCommand extends BasicCommand {
private List<String> query;
public JarCommand(Writer o) {
super(o);
}
@SuppressWarnings("deprecation")
public ReturnMsg execute(@SuppressWarnings("unused")
HBaseConfiguration conf) {
try {
String[] args = getQuery();
String usage = "JAR jarFile [mainClass] args...;\n";
if (args.length < 1) {
return new ReturnMsg(0, usage);
}
int firstArg = 0;
String fileName = args[firstArg++];
File file = new File(fileName);
String mainClassName = null;
JarFile jarFile;
try {
jarFile = new JarFile(fileName);
} catch (IOException io) {
throw new IOException("Error opening job jar: " + fileName + "\n")
.initCause(io);
}
Manifest manifest = jarFile.getManifest();
if (manifest != null) {
mainClassName = manifest.getMainAttributes().getValue("Main-Class");
}
jarFile.close();
if (mainClassName == null) {
if (args.length < 2) {
return new ReturnMsg(0, usage);
}
mainClassName = args[firstArg++];
}
mainClassName = mainClassName.replaceAll("/", ".");
File tmpDir = new File(new Configuration().get("hadoop.tmp.dir"));
tmpDir.mkdirs();
if (!tmpDir.isDirectory()) {
return new ReturnMsg(0, "Mkdirs failed to create " + tmpDir + "\n");
}
final File workDir = File.createTempFile("hadoop-unjar", "", tmpDir);
workDir.delete();
workDir.mkdirs();
if (!workDir.isDirectory()) {
return new ReturnMsg(0, "Mkdirs failed to create " + workDir + "\n");
}
Runtime.getRuntime().addShutdownHook(new Thread() {
public void run() {
try {
FileUtil.fullyDelete(workDir);
} catch (IOException e) {
e.printStackTrace();
}
}
});
RunJar.unJar(file, workDir);
ArrayList<URL> classPath = new ArrayList<URL>();
classPath.add(new File(workDir + "/").toURL());
classPath.add(file.toURL());
classPath.add(new File(workDir, "classes/").toURL());
File[] libs = new File(workDir, "lib").listFiles();
if (libs != null) {
for (int i = 0; i < libs.length; i++) {
classPath.add(libs[i].toURL());
}
}
ClassLoader loader = new URLClassLoader(classPath.toArray(new URL[0]));
Thread.currentThread().setContextClassLoader(loader);
Class<?> mainClass = Class.forName(mainClassName, true, loader);
Method main = mainClass.getMethod("main", new Class[] { Array.newInstance(
String.class, 0).getClass() });
String[] newArgs = Arrays.asList(args).subList(firstArg, args.length)
.toArray(new String[0]);
try {
main.invoke(null, new Object[] { newArgs });
} catch (InvocationTargetException e) {
throw e.getTargetException();
}
} catch (Throwable e) {
e.printStackTrace();
}
return null;
}
public void setQuery(List<String> query) {
this.query = query;
}
private String[] getQuery() {
return query.toArray(new String[] {});
}
@Override
public CommandType getCommandType() {
return CommandType.SHELL;
}
}
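JarCommand's core move, once the jar is unpacked and placed on a URLClassLoader, is to reflectively find the entry class's `public static void main(String[])` and hand it the leftover arguments, unwrapping `InvocationTargetException` so the caller sees the program's own failure. That dispatch in isolation (illustrative names; the stand-in `Target` replaces a class that would really come from the user's jar):

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

class ReflectiveMain {
  // Stand-in entry point; in JarCommand the class would be loaded
  // from the user's jar through a URLClassLoader instead.
  public static class Target {
    public static String lastArg;
    public static void main(String[] args) {
      lastArg = args.length > 0 ? args[0] : null;
    }
  }

  // Find public static void main(String[]) and invoke it, unwrapping
  // InvocationTargetException so callers see the target's own error,
  // as JarCommand does with getTargetException().
  static void invokeMain(Class<?> mainClass, String[] args) {
    try {
      Method main = mainClass.getMethod("main", String[].class);
      main.invoke(null, (Object) args);
    } catch (InvocationTargetException e) {
      throw new RuntimeException(e.getTargetException());
    } catch (ReflectiveOperationException e) {
      throw new RuntimeException(e);
    }
  }

  public static void main(String[] args) {
    invokeMain(Target.class, new String[] { "hello" });
    System.out.println(Target.lastArg);  // hello
  }
}
```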


@@ -1,54 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import org.apache.hadoop.hbase.HBaseConfiguration;
/**
* Message returned when a {@link Command} is
* {@link Command#execute(HBaseConfiguration)}'ed.
*/
public class ReturnMsg {
private final String msg;
private final int type;
public ReturnMsg(int i, String string) {
this.type = i;
this.msg = string;
}
public ReturnMsg(int i) {
this.type = i;
this.msg = "";
}
public String getMsg() {
return this.msg;
}
public int getType() {
return this.type;
}
@Override
public String toString() {
return this.msg;
}
}


@@ -1,119 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import java.util.Map;
import java.util.Set;
import org.apache.hadoop.hbase.BloomFilterDescriptor;
import org.apache.hadoop.hbase.BloomFilterDescriptor.BloomFilterType;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.io.Text;
/**
* The base class of the schema modification commands, CreateCommand and
* AlterCommand. Provides utility methods for schema alteration operations.
*/
public abstract class SchemaModificationCommand extends BasicCommand {
protected int maxVersions;
protected int maxLength;
protected HColumnDescriptor.CompressionType compression;
protected boolean inMemory;
protected boolean blockCacheEnabled;
protected BloomFilterDescriptor bloomFilterDesc;
protected BloomFilterType bloomFilterType;
protected int vectorSize;
protected int numHash;
protected int numEntries;
protected int timeToLive;
public SchemaModificationCommand(Writer o) {
super(o);
}
protected void initOptions() {
maxVersions = HColumnDescriptor.DEFAULT_N_VERSIONS;
maxLength = HColumnDescriptor.DEFAULT_MAX_VALUE_LENGTH;
compression = HColumnDescriptor.DEFAULT_COMPRESSION_TYPE;
inMemory = HColumnDescriptor.DEFAULT_IN_MEMORY;
blockCacheEnabled = HColumnDescriptor.DEFAULT_BLOCK_CACHE_ENABLED;
bloomFilterDesc = HColumnDescriptor.DEFAULT_BLOOM_FILTER_DESCRIPTOR;
timeToLive = HColumnDescriptor.DEFAULT_TIME_TO_LIVE;
}
/**
* Given a column name and column spec, returns an instance of
* HColumnDescriptor representing the column spec.
*/
protected HColumnDescriptor getColumnDescriptor(String column,
Map<String, Object> columnSpec) throws IllegalArgumentException {
initOptions();
Set<String> specs = columnSpec.keySet();
for (String spec : specs) {
spec = spec.toUpperCase();
if (spec.equals("MAX_VERSIONS")) {
maxVersions = (Integer) columnSpec.get(spec);
} else if (spec.equals("MAX_LENGTH")) {
maxLength = (Integer) columnSpec.get(spec);
} else if (spec.equals("COMPRESSION")) {
compression = HColumnDescriptor.CompressionType
.valueOf(((String) columnSpec.get(spec)).toUpperCase());
} else if (spec.equals("IN_MEMORY")) {
inMemory = (Boolean) columnSpec.get(spec);
} else if (spec.equals("BLOCK_CACHE_ENABLED")) {
blockCacheEnabled = (Boolean) columnSpec.get(spec);
} else if (spec.equals("BLOOMFILTER")) {
bloomFilterType = BloomFilterType.valueOf(((String) columnSpec.get(spec))
.toUpperCase());
} else if (spec.equals("VECTOR_SIZE")) {
vectorSize = (Integer) columnSpec.get(spec);
} else if (spec.equals("NUM_HASH")) {
numHash = (Integer) columnSpec.get(spec);
} else if (spec.equals("NUM_ENTRIES")) {
numEntries = (Integer) columnSpec.get(spec);
} else if (spec.equals("TTL")) {
timeToLive = (Integer) columnSpec.get(spec);
} else {
throw new IllegalArgumentException("Invalid option: " + spec);
}
}
// All specified options for this column have been gathered; build the
// bloom filter descriptor if one was requested.
if (bloomFilterType != null) {
if (specs.contains("NUM_ENTRIES")) {
bloomFilterDesc = new BloomFilterDescriptor(bloomFilterType, numEntries);
} else {
bloomFilterDesc = new BloomFilterDescriptor(bloomFilterType, vectorSize,
numHash);
}
}
column = appendDelimiter(column);
HColumnDescriptor columnDesc = new HColumnDescriptor(column.getBytes(),
maxVersions, compression, inMemory, blockCacheEnabled,
maxLength, timeToLive, bloomFilterDesc);
return columnDesc;
}
}
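`getColumnDescriptor` above is essentially a defaults-then-overrides walk over the spec map: start from the column defaults, overwrite whichever keys the spec supplies, and reject unknown keys. The shape of that walk, reduced to plain Java with stand-in option names and default values (not the real HBase defaults):

```java
import java.util.HashMap;
import java.util.Map;

class ColumnSpecWalk {
  // Start from defaults, overwrite whatever the spec supplies, and
  // reject unknown keys, mirroring getColumnDescriptor's loop. The
  // option names and default values here are illustrative stand-ins.
  static Map<String, Object> resolve(Map<String, Object> spec) {
    Map<String, Object> opts = new HashMap<>();
    opts.put("MAX_VERSIONS", 3);
    opts.put("IN_MEMORY", Boolean.FALSE);
    opts.put("TTL", -1);
    for (Map.Entry<String, Object> e : spec.entrySet()) {
      String key = e.getKey().toUpperCase();
      if (!opts.containsKey(key)) {
        throw new IllegalArgumentException("Invalid option: " + key);
      }
      opts.put(key, e.getValue());
    }
    return opts;
  }
}
```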


@@ -1,395 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.Shell;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Scanner;
import org.apache.hadoop.hbase.filter.RowFilterInterface;
import org.apache.hadoop.hbase.filter.StopRowFilter;
import org.apache.hadoop.hbase.filter.WhileMatchRowFilter;
import org.apache.hadoop.hbase.hql.generated.HQLParser;
import org.apache.hadoop.hbase.io.Cell;
import org.apache.hadoop.hbase.io.RowResult;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.hbase.util.Writables;
import org.apache.hadoop.io.Text;
/**
* Selects values from tables.
*/
public class SelectCommand extends BasicCommand {
private Text tableName;
private Text rowKey = new Text("");
private Text stopRow = new Text("");
private List<String> columns;
private long timestamp;
private int limit;
// Count of versions to return.
private int version;
private boolean countFunction = false;
private boolean whereClause = false;
private static final String[] HEADER_ROW_CELL = new String[] { "Row", "Cell" };
private static final String[] HEADER_COLUMN_CELL = new String[] { "Column", "Cell" };
private static final String[] HEADER = new String[] { "Row", "Column", "Cell" };
private static final String ASTERISK = "*";
private final TableFormatter formatter;
// Not instantiable
@SuppressWarnings("unused")
private SelectCommand() {
this(null, null);
}
public SelectCommand(final Writer o, final TableFormatter f) {
super(o);
this.formatter = f;
}
public ReturnMsg execute(final HBaseConfiguration conf) {
if (tableName.equals("") || rowKey == null || columns.size() == 0) {
return new ReturnMsg(0, "Syntax error : Please check 'Select' syntax.");
}
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName) && !isMetaTable()) {
return new ReturnMsg(0, "'" + tableName + "'" + TABLE_NOT_FOUND);
}
HTable table = new HTable(conf, tableName);
int count = 0;
if (whereClause) {
if (countFunction) {
count = 1;
} else {
count = compoundWherePrint(table, admin);
}
} else {
count = scanPrint(table, admin);
}
return new ReturnMsg(1, Integer.toString(count) + " row(s) in set.");
} catch (IOException e) {
String[] msg = e.getMessage().split("[,]");
return new ReturnMsg(0, msg[0]);
}
}
private boolean isMetaTable() {
return tableName.equals(new Text(HConstants.ROOT_TABLE_NAME)) ||
tableName.equals(new Text(HConstants.META_TABLE_NAME));
}
private int compoundWherePrint(HTable table, HBaseAdmin admin) {
int count = 0;
try {
if (version != 0) {
// A number of versions has been specified.
Cell[] result = null;
ParsedColumns parsedColumns = getColumns(admin, false);
boolean multiple = parsedColumns.isMultiple() || version > 1;
for (byte [] column : parsedColumns.getColumns()) {
if (count == 0) {
formatter.header(multiple ? HEADER_COLUMN_CELL : null);
}
if (timestamp != 0) {
result = table.get(rowKey.getBytes(), column, timestamp, version);
} else {
result = table.get(rowKey.getBytes(), column, version);
}
for (int ii = 0; result != null && ii < result.length; ii++) {
if (multiple) {
formatter.row(new String[] { Bytes.toString(column),
toString(column, result[ii].getValue()) });
} else {
formatter.row(new String[] { toString(column, result[ii].getValue()) });
}
count++;
}
}
} else {
for (Map.Entry<byte [], Cell> e : table.getRow(rowKey).entrySet()) {
if (count == 0) {
formatter.header(isMultiple() ? HEADER_COLUMN_CELL : null);
}
byte [] key = e.getKey();
String keyStr = Bytes.toString(key);
if (!columns.contains(ASTERISK) && !columns.contains(keyStr)) {
continue;
}
String cellData = toString(key, e.getValue().getValue());
if (isMultiple()) {
formatter.row(new String[] { keyStr, cellData });
} else {
formatter.row(new String[] { cellData });
}
count++;
}
}
if (count == 0 && Shell.HTML_OPTION != null) {
formatter.header(isMultiple() ? HEADER_COLUMN_CELL : null);
}
formatter.footer();
} catch (IOException e) {
e.printStackTrace();
}
return 1;
}
private String toString(final byte [] columnName, final byte[] cell)
throws IOException {
String result = null;
if (Bytes.equals(columnName, HConstants.COL_REGIONINFO)
|| Bytes.equals(columnName, HConstants.COL_SPLITA)
|| Bytes.equals(columnName, HConstants.COL_SPLITB)) {
result = Writables.getHRegionInfoOrNull(cell).toString();
} else if (Bytes.equals(columnName, HConstants.COL_STARTCODE)) {
result = Long.toString(Bytes.toLong(cell));
} else {
result = Bytes.toString(cell);
}
return result;
}
private String toString(final byte [] columnName, final Cell cell)
throws IOException {
if (cell == null) {
return null;
}
return toString(columnName, cell.getValue());
}
/**
* Data structure holding the columns to use when scanning, and whether or
* not the scan could return more than one column.
*/
class ParsedColumns {
private final List<byte []> cols;
private final boolean isMultiple;
ParsedColumns(final List<byte []> columns) {
this(columns, true);
}
ParsedColumns(final List<byte []> columns, final boolean isMultiple) {
this.cols = columns;
this.isMultiple = isMultiple;
}
public List<byte []> getColumns() {
return this.cols;
}
public boolean isMultiple() {
return this.isMultiple;
}
}
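The ParsedColumns structure above pairs a column list with a single-versus-multiple flag that decides whether the formatter must print column names per row. A standalone sketch (hypothetical class name; the single-argument constructor uses the same size > 1 rule that getColumns() applies to explicit column lists):

```java
import java.util.Arrays;
import java.util.List;

public class ParsedColumnsDemo {
    final List<byte[]> cols;
    final boolean multiple;

    // An explicit list is "multiple" only when it names more than one column.
    ParsedColumnsDemo(List<byte[]> cols) {
        this(cols, cols.size() > 1);
    }

    ParsedColumnsDemo(List<byte[]> cols, boolean multiple) {
        this.cols = cols;
        this.multiple = multiple;
    }

    // One named column: the formatter can omit the column-name header.
    public static boolean demoSingle() {
        return new ParsedColumnsDemo(
            Arrays.asList("info:name".getBytes())).multiple;
    }

    // Two column families: column names must be printed per row.
    public static boolean demoPair() {
        return new ParsedColumnsDemo(
            Arrays.asList("info:".getBytes(), "anchor:".getBytes())).multiple;
    }

    public static void main(String[] args) {
        System.out.println(demoSingle() + " " + demoPair());
    }
}
```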
private int scanPrint(HTable table, HBaseAdmin admin) {
int count = 0;
Scanner scan = null;
try {
ParsedColumns parsedColumns = getColumns(admin, true);
// getColumns() returns byte [] column names; convert them to Text for
// the Text-based scanner API (the original toArray(new Text[] {}) call
// could not compile against a List of byte []).
List<byte []> colList = parsedColumns.getColumns();
Text[] cols = new Text[colList.size()];
for (int i = 0; i < cols.length; i++) {
cols[i] = new Text(Bytes.toString(colList.get(i)));
}
// Open exactly one scanner: a stop-row filter takes precedence;
// otherwise use the timestamped or the plain scan.
if (this.stopRow.toString().length() > 0) {
RowFilterInterface filter = new WhileMatchRowFilter(new StopRowFilter(
stopRow.getBytes()));
scan = table.getScanner(Bytes.toByteArrays(cols), rowKey.getBytes(), filter);
} else if (timestamp == 0) {
scan = table.getScanner(cols, rowKey);
} else {
scan = table.getScanner(Bytes.toByteArrays(cols), rowKey.getBytes(),
timestamp);
}
RowResult results = scan.next();
// If only one column in query, then don't print out the column.
while (results != null && checkLimit(count)) {
if (count == 0 && !countFunction) {
formatter.header((parsedColumns.isMultiple()) ? HEADER : HEADER_ROW_CELL);
}
byte [] r = results.getRow();
if (!countFunction) {
for (byte [] columnKey : results.keySet()) {
String cellData = toString(columnKey, results.get(columnKey));
if (parsedColumns.isMultiple()) {
formatter.row(new String[] { Bytes.toString(r),
Bytes.toString(columnKey), cellData });
} else {
// Don't print out the column since only one specified in query.
formatter.row(new String[] { Bytes.toString(r), cellData });
}
if (limit > 0 && count >= limit) {
break;
}
}
}
count++;
results = scan.next();
}
if (count == 0 && Shell.HTML_OPTION != null && !countFunction) {
formatter.header((parsedColumns.isMultiple()) ? HEADER : HEADER_ROW_CELL);
}
formatter.footer();
scan.close();
} catch (IOException e) {
e.printStackTrace();
}
return count;
}
/**
* Make sense of the supplied list of columns.
*
* @param admin Admin to use.
* @param scanning True if the columns will be passed to a scanner ('$' is
* appended so each name becomes an exact-match regex).
* @return Interpretation of supplied list of columns.
*/
public ParsedColumns getColumns(final HBaseAdmin admin, final boolean scanning) {
ParsedColumns result = null;
try {
if (columns.contains(ASTERISK)) {
if (tableName.equals(new Text(HConstants.ROOT_TABLE_NAME))
|| tableName.equals(new Text(HConstants.META_TABLE_NAME))) {
result = new ParsedColumns(Arrays.asList(HConstants.COLUMN_FAMILY_ARRAY));
} else {
HTableDescriptor[] tables = admin.listTables();
for (int i = 0; i < tables.length; i++) {
if (tables[i].getNameAsString().equals(tableName.toString())) {
List<byte []> cols = new ArrayList<byte []>();
for (HColumnDescriptor h: tables[i].getFamilies()) {
cols.add(h.getName());
}
result = new ParsedColumns(cols);
break;
}
}
}
} else {
List<byte []> tmpList = new ArrayList<byte []>();
for (int i = 0; i < columns.size(); i++) {
byte [] column = null;
// Add '$' to column name if we are scanning. Scanners support
// regex column names. Adding '$', the column becomes a
// regex that does an explicit match on the supplied column name.
// Otherwise, if the specified column is a column family, then
// default behavior is to fetch all columns that have a matching
// column family.
column = (columns.get(i).contains(":")) ? new Text(columns.get(i)
+ (scanning ? "$" : "")).getBytes() : new Text(columns.get(i) + ":"
+ (scanning ? "$" : "")).getBytes();
tmpList.add(column);
}
result = new ParsedColumns(tmpList, tmpList.size() > 1);
}
} catch (IOException e) {
e.printStackTrace();
}
return result;
}
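The comment inside getColumns() above describes how user-supplied column names become scanner patterns. The transformation in isolation (hypothetical method name, same logic as the ternary in the loop):

```java
public class ColumnNameDemo {
    // Mirrors getColumns(): when scanning, append '$' so the name becomes a
    // regex matching exactly that column; a bare family name first gets the
    // ':' family delimiter appended.
    public static String toScannerColumn(String column, boolean scanning) {
        String suffix = scanning ? "$" : "";
        return column.contains(":") ? column + suffix : column + ":" + suffix;
    }

    public static void main(String[] args) {
        System.out.println(toScannerColumn("info:name", true));  // info:name$
        System.out.println(toScannerColumn("info", true));       // info:$
        System.out.println(toScannerColumn("info", false));      // info:
    }
}
```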
/*
* @return True if query contains multiple columns.
*/
private boolean isMultiple() {
return this.columns.size() > 1 || this.columns.contains(ASTERISK);
}
private boolean checkLimit(int count) {
return this.limit == 0 || this.limit > count;
}
public void setTable(String table) {
this.tableName = new Text(table);
}
public void setLimit(int limit) {
this.limit = limit;
}
public void setWhere(boolean isWhereClause) {
if (isWhereClause)
this.whereClause = true;
}
public void setTimestamp(String timestamp) {
this.timestamp = Long.parseLong(timestamp);
}
public void setColumns(List<String> columns) {
this.columns = columns;
}
public void setRowKey(String rowKey) {
if (rowKey == null)
this.rowKey = null;
else
this.rowKey = new Text(rowKey);
}
public void setCountFunction(boolean countFunction) {
this.countFunction = countFunction;
}
public void setStopRow(String stopRow) {
this.stopRow = new Text(stopRow);
}
/**
* @param version Set maximum versions for this selection
*/
public void setVersion(int version) {
this.version = version;
}
public static void main(String[] args) throws Exception {
Writer out = new OutputStreamWriter(System.out, "UTF-8");
HBaseConfiguration c = new HBaseConfiguration();
// For debugging
TableFormatterFactory tff = new TableFormatterFactory(out, c);
HQLParser parser = new HQLParser("select * from 'x' where row='x';", out, tff.get());
Command cmd = parser.terminatedCommand();
ReturnMsg rm = cmd.execute(c);
out.write(rm == null ? "" : rm.toString());
out.flush();
}
}


@@ -1,81 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
/**
* Shows all available tables.
*/
public class ShowCommand extends BasicCommand {
private static final String[] HEADER = new String[] { "Name", "Descriptor" };
private String command;
private final TableFormatter formatter;
// Not instantiable
@SuppressWarnings("unused")
private ShowCommand() {
this(null, null);
}
public ShowCommand(final Writer o, final TableFormatter f) {
this(o, f, null);
}
public ShowCommand(final Writer o, final TableFormatter f,
final String argument) {
super(o);
this.formatter = f;
this.command = argument;
}
public ReturnMsg execute(final HBaseConfiguration conf) {
if (command == null) {
return new ReturnMsg(0, "Syntax error : Please check 'Show' syntax.");
}
try {
HBaseAdmin admin = new HBaseAdmin(conf);
int tableLength = 0;
HTableDescriptor[] tables = admin.listTables();
tableLength = tables.length;
if (tableLength == 0) {
return new ReturnMsg(0, "No tables found.");
}
formatter.header(HEADER);
for (int i = 0; i < tableLength; i++) {
String tableName = tables[i].getName().toString();
formatter.row(new String[] { tableName, tables[i].toString() });
}
formatter.footer();
return new ReturnMsg(1, tableLength + " table(s) in set.");
} catch (IOException e) {
return new ReturnMsg(0, "error msg : " + e.toString());
}
}
public void setArgument(String argument) {
this.command = argument;
}
}


@@ -1,63 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.hql.formatter.AsciiTableFormatter;
/**
* Interface implemented by table formatters outputting select results.
* Implementations must have a constructor that takes a Writer.
*
* @see AsciiTableFormatter
*/
public interface TableFormatter {
/**
* Output header.
*
* @param titles Titles to emit.
* @throws IOException
*/
public void header(final String[] titles) throws IOException;
/**
* Output footer.
*
* @throws IOException
*/
public void footer() throws IOException;
/**
* Output a row.
*
* @param cells
* @throws IOException
*/
public void row(final String[] cells) throws IOException;
/**
* @return Output stream being used. (This is in the interface to enforce the
* fact that formatters use Writers -- that they operate on character
* streams rather than on byte streams.)
*/
public Writer getOut();
}


@@ -1,83 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.Writer;
import java.lang.reflect.Constructor;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.hql.formatter.AsciiTableFormatter;
/**
* Table formatter. Specify formatter by setting "hbaseshell.formatter" property
* in <code>hbase-site.xml</code> or by setting system property
* <code>hbaseshell.formatter</code>. System property setting prevails over
* all other configurations. Outputs UTF-8 encoded Strings even if original data
* is binary. On static initialization, changes System.out to be a UTF-8 output
* stream.
* <p>
* TODO: Mysql has --skip-column-names and --silent which inserts a tab as
* separator. Also has --html and --xml.
* <p>
* To use the html formatter, currently set HBASE_OPTS as in:
* <code>$ HBASE_OPTS="-Dhbaseshell.formatter=org.apache.hadoop.hbase.shell.formatter.HtmlTableFormatter" ./bin/hbase shell</code>
* </p>
*/
public class TableFormatterFactory {
private static final Log LOG = LogFactory.getLog(TableFormatterFactory.class
.getName());
private static final String FORMATTER_KEY = "hbaseshell.formatter";
private final TableFormatter formatter;
/**
* Not instantiable
*/
@SuppressWarnings( { "unchecked", "unused" })
private TableFormatterFactory() {
this(null, null);
}
@SuppressWarnings("unchecked")
public TableFormatterFactory(final Writer out, final Configuration c) {
String className = System.getProperty(FORMATTER_KEY);
if (className == null) {
className = c.get(FORMATTER_KEY, AsciiTableFormatter.class.getName());
}
LOG.debug("Table formatter class: " + className);
try {
Class<TableFormatter> clazz = (Class<TableFormatter>) Class
.forName(className);
Constructor<?> constructor = clazz.getConstructor(Writer.class);
this.formatter = (TableFormatter) constructor.newInstance(out);
} catch (Exception e) {
throw new RuntimeException("Failed instantiation of " + className, e);
}
}
/**
* @return The table formatter instance
*/
@SuppressWarnings("unchecked")
public TableFormatter get() {
return this.formatter;
}
}
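The factory above resolves the formatter class in a fixed precedence order: system property, then configuration, then the compiled-in default. A minimal standalone sketch of that lookup (a plain Map stands in for the Hadoop Configuration object; names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class FormatterLookupDemo {
    public static final String KEY = "hbaseshell.formatter";
    public static final String DEFAULT_CLASS =
        "org.apache.hadoop.hbase.hql.formatter.AsciiTableFormatter";

    // System property wins over the configuration; the configuration wins
    // over the compiled-in default.
    public static String resolve(Map<String, String> conf) {
        String className = System.getProperty(KEY);
        if (className == null) {
            className = conf.getOrDefault(KEY, DEFAULT_CLASS);
        }
        return className;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        System.out.println(resolve(conf));
        conf.put(KEY, "my.custom.Formatter");
        System.out.println(resolve(conf));
    }
}
```

The resolved class name is then instantiated reflectively through its Writer-taking constructor, as the factory's try block shows.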


@@ -1,82 +0,0 @@
/**
* Copyright 2007 The Apache Software Foundation
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.io.Text;
/**
* Truncate table is used to clean all data from a table.
*/
public class TruncateCommand extends BasicCommand {
private Text tableName;
public TruncateCommand(Writer o) {
super(o);
}
public ReturnMsg execute(final HBaseConfiguration conf) {
if (this.tableName == null)
return new ReturnMsg(0, "Syntax error : Please check 'Truncate' syntax.");
try {
HBaseAdmin admin = new HBaseAdmin(conf);
if (!admin.tableExists(tableName)) {
return new ReturnMsg(0, "Table not found.");
}
HTableDescriptor[] tables = admin.listTables();
HColumnDescriptor[] columns = null;
for (int i = 0; i < tables.length; i++) {
if (tables[i].getNameAsString().equals(tableName.toString())) {
columns = tables[i].getFamilies().toArray(
new HColumnDescriptor[] {});
break;
}
}
println("Truncating a '" + tableName + "' table ... Please wait.");
admin.deleteTable(tableName); // delete the table
HTableDescriptor tableDesc = new HTableDescriptor(tableName.getBytes());
for (int i = 0; i < columns.length; i++) {
tableDesc.addFamily(columns[i]);
}
admin.createTable(tableDesc); // re-create the table
} catch (IOException e) {
return new ReturnMsg(0, "error msg : " + e.toString());
}
return new ReturnMsg(0, "'" + tableName + "' is successfully truncated.");
}
public void setTableName(String tableName) {
this.tableName = new Text(tableName);
}
@Override
public CommandType getCommandType() {
return CommandType.DDL;
}
}
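TruncateCommand above implements truncate as delete-then-recreate: remember the column families, drop the table, then re-create it empty with the same schema. A toy standalone model of that pattern (a Map of table name to family list stands in for the HBase catalog; all names are hypothetical):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TruncateDemo {
    // Truncate as the command implements it: capture the schema, delete the
    // table, then re-create it empty with the captured column families.
    public static Map<String, List<String>> truncate(
            Map<String, List<String>> catalog, String table) {
        List<String> families = catalog.get(table);
        if (families == null) {
            return catalog;  // analogous to the "Table not found." return
        }
        catalog.remove(table);                           // deleteTable
        catalog.put(table, new ArrayList<>(families));   // createTable
        return catalog;
    }

    public static void main(String[] args) {
        Map<String, List<String>> cat = new HashMap<>();
        cat.put("t", Arrays.asList("info:", "anchor:"));
        System.out.println(truncate(cat, "t").get("t"));  // [info:, anchor:]
    }
}
```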


@@ -1,168 +0,0 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql.formatter;
import java.io.IOException;
import java.io.Writer;
import org.apache.hadoop.hbase.hql.TableFormatter;
/**
* Formatter that outputs data inside an ASCII table.
* If only a single cell result, then no formatting is done. Presumption is
* that client manages serial access outputting tables. Does not close passed
* {@link Writer}.
*/
public class AsciiTableFormatter implements TableFormatter {
private static final String COLUMN_DELIMITER = "| ";
private static final String COLUMN_CLOSER = "|";
private static final int DEFAULT_COLUMN_WIDTH = 26;
// Width is a line of content + delimiter
private int columnWidth = DEFAULT_COLUMN_WIDTH;
// Amount of width to use for a line of content.
private int columnContentWidth =
DEFAULT_COLUMN_WIDTH - COLUMN_DELIMITER.length();
// COLUMN_LINE is put at head and foot of a column and per column, is drawn
// as row delimiter
private String columnHorizLine;
private final String COLUMN_HORIZ_LINE_CLOSER = "+";
// Used padding content to fill column
private final String PADDING_CHAR = " ";
// True if we are to output no formatting.
private boolean noFormatting = false;
private final Writer out;
private final String LINE_SEPARATOR = System.getProperty("line.separator");
// Not instantiable
@SuppressWarnings("unused")
private AsciiTableFormatter() {
this(null);
}
public AsciiTableFormatter(final Writer o) {
this.out = o;
}
public Writer getOut() {
return this.out;
}
/**
* @param titles List of titles. Pass null if no formatting (i.e.
* no header, no footer, etc.).
* @throws IOException
*/
public void header(String[] titles) throws IOException {
if (titles == null) {
// print nothing.
setNoFormatting(true);
return;
}
// Calculate width of columns.
this.columnWidth = titles.length == 1? 3 * DEFAULT_COLUMN_WIDTH:
titles.length == 2? 39: DEFAULT_COLUMN_WIDTH;
this.columnContentWidth = this.columnWidth - COLUMN_DELIMITER.length();
// Create the horizontal line to draw across the top of each column.
this.columnHorizLine = calculateColumnHorizLine(this.columnWidth);
// Print out a column topper per column.
printRowDelimiter(titles.length);
row(titles);
}
public void row(String [] cells) throws IOException {
if (isNoFormatting()) {
getOut().write(cells[0]);
getOut().flush();
return;
}
// Ok. Output cells a line at a time w/ delimiters between cells.
int [] indexes = new int[cells.length];
int allFinished = 0;
for (int i = 0; i < indexes.length; i++) {
indexes[i] = 0;
if (cells[i].length() == 0) {
// An empty cell is already finished; count it here, or the while
// loop below would never terminate when a cell is empty.
allFinished++;
}
}
while (allFinished < indexes.length) {
StringBuffer sb = new StringBuffer();
for (int i = 0; i < cells.length; i++) {
sb.append(COLUMN_DELIMITER);
int offset = indexes[i];
if (offset + this.columnContentWidth >= cells[i].length()) {
String substr = cells[i].substring(offset);
if (substr.length() > 0) {
// This column is finished
allFinished++;
sb.append(substr);
}
for (int j = 0; j < this.columnContentWidth - substr.length(); j++) {
sb.append(PADDING_CHAR);
}
indexes[i] = cells[i].length();
} else {
String substr = cells[i].substring(indexes[i],
indexes[i] + this.columnContentWidth);
indexes[i] += this.columnContentWidth;
sb.append(substr);
}
}
sb.append(COLUMN_CLOSER);
getOut().write(sb.toString());
getOut().write(LINE_SEPARATOR);
getOut().flush();
}
printRowDelimiter(cells.length);
}
public void footer() throws IOException {
if (isNoFormatting()) {
// If no formatting, output a newline to delimit cell and the
// result summary output at end of every command.
getOut().write(LINE_SEPARATOR);
getOut().flush();
}
// We're done. Clear flag.
setNoFormatting(false);
}
private void printRowDelimiter(final int columnCount) throws IOException {
for (int i = 0; i < columnCount; i++) {
getOut().write(this.columnHorizLine);
}
getOut().write(COLUMN_HORIZ_LINE_CLOSER);
getOut().write(LINE_SEPARATOR);
getOut().flush();
}
private String calculateColumnHorizLine(final int width) {
StringBuffer sb = new StringBuffer();
sb.append("+");
for (int i = 1; i < width; i++) {
sb.append("-");
}
return sb.toString();
}
public boolean isNoFormatting() {
return this.noFormatting;
}
public void setNoFormatting(boolean noFormatting) {
this.noFormatting = noFormatting;
}
}
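The index-advancing loop in row() above effectively wraps each cell into fixed-width chunks, one chunk per printed line. The core chunking step in isolation (hypothetical helper, not part of the original class):

```java
import java.util.ArrayList;
import java.util.List;

public class CellWrapDemo {
    // Split a cell's text into lines of at most `width` characters, the way
    // AsciiTableFormatter.row() advances a per-column index each pass.
    public static List<String> wrap(String cell, int width) {
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < cell.length(); i += width) {
            lines.add(cell.substring(i, Math.min(i + width, cell.length())));
        }
        if (lines.isEmpty()) {
            lines.add("");  // an empty cell still occupies one padded line
        }
        return lines;
    }

    public static void main(String[] args) {
        System.out.println(wrap("abcdefgh", 3));  // [abc, def, gh]
    }
}
```

Each returned line would then be padded out to columnContentWidth and framed with the column delimiters.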


@@ -1,186 +0,0 @@
/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.hadoop.hbase.hql.formatter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.UnsupportedEncodingException;
import java.io.Writer;
import org.apache.hadoop.hbase.hql.TableFormatter;
import org.znerd.xmlenc.LineBreak;
import org.znerd.xmlenc.XMLOutputter;
import org.znerd.xmlenc.XMLEncoder;
import org.znerd.xmlenc.InvalidXMLException;
/**
* Formatter that outputs data inside an HTML table. If only a single cell
* result, then no formatting is done. Presumption is that client manages
* serial access outputting tables. Does not close passed {@link Writer}.
* Since hbase columns have no typing, the formatter presumes a type of
* UTF-8 String. If cells contain images, etc., this formatter will mangle
* their display.
* <p>TODO: Uses xmlenc. Hopefully it flushes every so often (claims it's a
* stream-based outputter). Verify.
*/
public class HtmlTableFormatter implements TableFormatter {
private final XMLOutputter outputter;
private boolean noFormatting = false;
private final Writer out;
// Uninstantiable
@SuppressWarnings("unused")
private HtmlTableFormatter() {
this(null);
}
/*
* An encoder that replaces illegal XML characters with the '@' sign.
*/
private static class HbaseXMLEncoder extends XMLEncoder {
@SuppressWarnings("deprecation")
public HbaseXMLEncoder()
throws IllegalArgumentException, UnsupportedEncodingException {
super("UTF-8");
}
@Override
public void text(Writer w, char c, boolean escape)
throws InvalidXMLException, IOException {
super.text(w, legalize(c), escape);
}
@Override
public void text(Writer w, char[] cs, int start, int length, boolean b)
throws NullPointerException, IndexOutOfBoundsException,
InvalidXMLException, IOException {
for (int i = start; i < start + length; i++) {
cs[i] = legalize(cs[i]);
}
super.text(w, cs, start, length, b);
}
/**
* If character is in range A, C, or E, then replace with '@'
* <pre>
* A 0-8 Control characters -- Not allowed in XML 1.0 --
* B 9-10 Normal characters Never needed
* C 11-12 Control characters -- Not allowed in XML 1.0 --
* D 13 Normal character Never needed
* E 14-31 Control characters -- Not allowed in XML 1.0 --
* </pre>
* @param c Character to look at.
* @return The passed character, or '@' if the character is illegal in XML 1.0.
*/
private char legalize(final char c) {
return (c <= 8 || c == 11 || c == 12 || (c >= 14 && c <= 31))? '@': c;
}
}
public HtmlTableFormatter(final Writer o) {
this.out = o;
try {
// Looking at the xmlenc source, there should be no issue w/ wrapping
// the stream -- i.e. no hanging resources.
this.outputter = new XMLOutputter(this.out, new HbaseXMLEncoder());
String os = System.getProperty("os.name").toLowerCase();
// Shell likes the DOS output.
this.outputter.setLineBreak(os.contains("windows")?
LineBreak.DOS: LineBreak.UNIX);
this.outputter.setIndentation(" ");
} catch (Exception e) {
throw new RuntimeException(e);
}
}
/**
* @param titles List of titles. Pass null if no formatting (i.e.
* no header, no footer, etc.).
* @throws IOException
*/
@SuppressWarnings("static-access")
public void header(String[] titles) throws IOException {
if (titles == null) {
// print nothing.
setNoFormatting(true);
return;
}
this.outputter.setState(this.outputter.BEFORE_XML_DECLARATION, null);
// Can't add a 'border=1' attribute because it's included on the end in
this.outputter.startTag("table");
this.outputter.startTag("tr");
for (int i = 0; i < titles.length; i++) {
this.outputter.startTag("th");
this.outputter.pcdata(titles[i]);
this.outputter.endTag();
}
this.outputter.endTag();
}
public void row(String [] cells) throws IOException{
if (isNoFormatting()) {
getOut().write(cells[0]);
return;
}
this.outputter.startTag("tr");
for (int i = 0; i < cells.length; i++) {
this.outputter.startTag("td");
this.outputter.pcdata(cells[i]);
this.outputter.endTag();
}
this.outputter.endTag();
}
public void footer() throws IOException {
if (!isNoFormatting()) {
// To close the table
this.outputter.endTag();
this.outputter.endDocument();
}
// We're done. Clear flag.
this.setNoFormatting(false);
// If no formatting, output a newline to delimit cell and the
// result summary output at end of every command. If html, also emit a
// newline to delimit html and summary line.
getOut().write(System.getProperty("line.separator"));
getOut().flush();
}
public Writer getOut() {
return this.out;
}
public boolean isNoFormatting() {
return this.noFormatting;
}
public void setNoFormatting(boolean noFormatting) {
this.noFormatting = noFormatting;
}
public static void main(String[] args) throws IOException {
HtmlTableFormatter f =
new HtmlTableFormatter(new OutputStreamWriter(System.out, "UTF-8"));
f.header(new String [] {"a", "b"});
f.row(new String [] {"a", "b"});
f.footer();
}
}
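The range table in the HbaseXMLEncoder javadoc above (ranges A, C, and E are the XML-illegal control characters) is small enough to restate and check directly. A standalone sketch using the same predicate as the encoder:

```java
public class LegalizeDemo {
    // Ranges A (0-8), C (11-12) and E (14-31) are control characters that
    // XML 1.0 forbids and that the encoder replaces with '@'; the normal
    // characters in ranges B (9-10) and D (13) pass through unchanged.
    public static char legalize(char c) {
        return (c <= 8 || c == 11 || c == 12 || (c >= 14 && c <= 31)) ? '@' : c;
    }

    public static void main(String[] args) {
        System.out.println(legalize('\u0007'));  // bell (range A) becomes @
        System.out.println(legalize('\t'));      // tab (range B) survives
        System.out.println(legalize('A'));       // ordinary text untouched
    }
}
```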

File diff suppressed because it is too large.


@@ -1,149 +0,0 @@
/* Generated By:JavaCC: Do not edit this line. HQLParserConstants.java */
package org.apache.hadoop.hbase.hql.generated;
public interface HQLParserConstants {
int EOF = 0;
int HELP = 5;
int ALTER = 6;
int CLEAR = 7;
int SHOW = 8;
int DESCRIBE = 9;
int DESC = 10;
int CREATE = 11;
int DROP = 12;
int TRUNCATE = 13;
int FS = 14;
int JAR = 15;
int EXIT = 16;
int INSERT = 17;
int INTO = 18;
int TABLE = 19;
int DELETE = 20;
int SELECT = 21;
int ENABLE = 22;
int DISABLE = 23;
int STARTING = 24;
int WHERE = 25;
int FROM = 26;
int UNTIL = 27;
int ROW = 28;
int VALUES = 29;
int COLUMNFAMILIES = 30;
int TIMESTAMP = 31;
int NUM_VERSIONS = 32;
int LIMIT = 33;
int AND = 34;
int OR = 35;
int COMMA = 36;
int LPAREN = 37;
int RPAREN = 38;
int EQUALS = 39;
int LCOMP = 40;
int RCOMP = 41;
int NOT = 42;
int IN = 43;
int NOTEQUAL = 44;
int ASTERISK = 45;
int MAX_VERSIONS = 46;
int MAX_LENGTH = 47;
int COMPRESSION = 48;
int NONE = 49;
int BLOCK = 50;
int RECORD = 51;
int IN_MEMORY = 52;
int BLOCK_CACHE_ENABLED = 53;
int TTL = 54;
int BLOOMFILTER = 55;
int COUNTING_BLOOMFILTER = 56;
int RETOUCHED_BLOOMFILTER = 57;
int VECTOR_SIZE = 58;
int NUM_HASH = 59;
int NUM_ENTRIES = 60;
int ADD = 61;
int CHANGE = 62;
int COUNT = 63;
int ID = 64;
int INTEGER_LITERAL = 65;
int FLOATING_POINT_LITERAL = 66;
int EXPONENT = 67;
int QUOTED_IDENTIFIER = 68;
int STRING_LITERAL = 69;
int DEFAULT = 0;
String[] tokenImage = {
"<EOF>",
"\" \"",
"\"\\t\"",
"\"\\r\"",
"\"\\n\"",
"\"help\"",
"\"alter\"",
"\"clear\"",
"\"show\"",
"\"describe\"",
"\"desc\"",
"\"create\"",
"\"drop\"",
"\"truncate\"",
"\"fs\"",
"\"jar\"",
"\"exit\"",
"\"insert\"",
"\"into\"",
"\"table\"",
"\"delete\"",
"\"select\"",
"\"enable\"",
"\"disable\"",
"\"starting\"",
"\"where\"",
"\"from\"",
"\"until\"",
"\"row\"",
"\"values\"",
"\"columnfamilies\"",
"\"timestamp\"",
"\"num_versions\"",
"\"limit\"",
"\"and\"",
"\"or\"",
"\",\"",
"\"(\"",
"\")\"",
"\"=\"",
"\">\"",
"\"<\"",
"\"not\"",
"\"in\"",
"\"!=\"",
"\"*\"",
"\"max_versions\"",
"\"max_length\"",
"\"compression\"",
"\"none\"",
"\"block\"",
"\"record\"",
"\"in_memory\"",
"\"block_cache_enabled\"",
"\"ttl\"",
"\"bloomfilter\"",
"\"counting_bloomfilter\"",
"\"retouched_bloomfilter\"",
"\"vector_size\"",
"\"num_hash\"",
"\"num_entries\"",
"\"add\"",
"\"change\"",
"\"count\"",
"<ID>",
"<INTEGER_LITERAL>",
"<FLOATING_POINT_LITERAL>",
"<EXPONENT>",
"<QUOTED_IDENTIFIER>",
"<STRING_LITERAL>",
"\";\"",
};
}


@@ -1,192 +0,0 @@
/* Generated By:JavaCC: Do not edit this line. ParseException.java Version 3.0 */
package org.apache.hadoop.hbase.hql.generated;
/**
* This exception is thrown when parse errors are encountered.
* You can explicitly create objects of this exception type by
* calling the method generateParseException in the generated
* parser.
*
* You can modify this class to customize your error reporting
* mechanisms so long as you retain the public fields.
*/
public class ParseException extends Exception {
/**
* This constructor is used by the method "generateParseException"
* in the generated parser. Calling this constructor generates
* a new object of this type with the fields "currentToken",
* "expectedTokenSequences", and "tokenImage" set. The boolean
* flag "specialConstructor" is also set to true to indicate that
* this constructor was used to create this object.
* This constructor calls its super class with the empty string
* to force the "toString" method of parent class "Throwable" to
* print the error message in the form:
* ParseException: <result of getMessage>
*/
public ParseException(Token currentTokenVal,
int[][] expectedTokenSequencesVal,
String[] tokenImageVal
)
{
super("");
specialConstructor = true;
currentToken = currentTokenVal;
expectedTokenSequences = expectedTokenSequencesVal;
tokenImage = tokenImageVal;
}
/**
* The following constructors are for use by you for whatever
* purpose you can think of. Constructing the exception in this
* manner makes the exception behave in the normal way - i.e., as
* documented in the class "Throwable". The fields "errorToken",
* "expectedTokenSequences", and "tokenImage" do not contain
* relevant information. The JavaCC generated code does not use
* these constructors.
*/
public ParseException() {
super();
specialConstructor = false;
}
public ParseException(String message) {
super(message);
specialConstructor = false;
}
/**
* This variable determines which constructor was used to create
* this object and thereby affects the semantics of the
* "getMessage" method (see below).
*/
protected boolean specialConstructor;
/**
* This is the last token that has been consumed successfully. If
* this object has been created due to a parse error, the token
* following this token will (therefore) be the first error token.
*/
public Token currentToken;
/**
* Each entry in this array is an array of integers. Each array
* of integers represents a sequence of tokens (by their ordinal
* values) that is expected at this point of the parse.
*/
public int[][] expectedTokenSequences;
/**
* This is a reference to the "tokenImage" array of the generated
* parser within which the parse error occurred. This array is
* defined in the generated ...Constants interface.
*/
public String[] tokenImage;
/**
* This method has the standard behavior when this object has been
* created using the standard constructors. Otherwise, it uses
* "currentToken" and "expectedTokenSequences" to generate a parse
* error message and returns it. If this object has been created
* due to a parse error, and you do not catch it (it gets thrown
* from the parser), then this method is called during the printing
* of the final stack trace, and hence the correct error message
* gets displayed.
*/
public String getMessage() {
if (!specialConstructor) {
return super.getMessage();
}
StringBuffer expected = new StringBuffer();
int maxSize = 0;
for (int i = 0; i < expectedTokenSequences.length; i++) {
if (maxSize < expectedTokenSequences[i].length) {
maxSize = expectedTokenSequences[i].length;
}
for (int j = 0; j < expectedTokenSequences[i].length; j++) {
expected.append(tokenImage[expectedTokenSequences[i][j]]).append(" ");
}
if (expectedTokenSequences[i][expectedTokenSequences[i].length - 1] != 0) {
expected.append("...");
}
expected.append(eol).append(" ");
}
String retval = "Encountered \"";
Token tok = currentToken.next;
for (int i = 0; i < maxSize; i++) {
if (i != 0) retval += " ";
if (tok.kind == 0) {
retval += tokenImage[0];
break;
}
retval += add_escapes(tok.image);
tok = tok.next;
}
retval += "\" at line " + currentToken.next.beginLine + ", column " + currentToken.next.beginColumn;
retval += "." + eol;
if (expectedTokenSequences.length == 1) {
retval += "Was expecting:" + eol + " ";
} else {
retval += "Was expecting one of:" + eol + " ";
}
retval += expected.toString();
return retval;
}
/**
* The end of line string for this machine.
*/
protected String eol = System.getProperty("line.separator", "\n");
/**
* Used to convert raw characters to their escaped version
   * when these raw versions cannot be used as part of an ASCII
* string literal.
*/
protected String add_escapes(String str) {
StringBuffer retval = new StringBuffer();
char ch;
for (int i = 0; i < str.length(); i++) {
switch (str.charAt(i))
{
case 0 :
continue;
case '\b':
retval.append("\\b");
continue;
case '\t':
retval.append("\\t");
continue;
case '\n':
retval.append("\\n");
continue;
case '\f':
retval.append("\\f");
continue;
case '\r':
retval.append("\\r");
continue;
case '\"':
retval.append("\\\"");
continue;
case '\'':
retval.append("\\\'");
continue;
case '\\':
retval.append("\\\\");
continue;
default:
if ((ch = str.charAt(i)) < 0x20 || ch > 0x7e) {
String s = "0000" + Integer.toString(ch, 16);
retval.append("\\u" + s.substring(s.length() - 4, s.length()));
} else {
retval.append(ch);
}
continue;
}
}
return retval.toString();
}
}
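The escaping rule implemented by `add_escapes` above (and by the identical `addEscapes` in `TokenMgrError` further down) can be exercised in isolation. The following is a hypothetical standalone copy for illustration only — the `EscapeDemo` class name is not part of the generated parser sources:

```java
// Hypothetical standalone copy of the escaping logic from add_escapes above;
// not part of the generated parser sources.
class EscapeDemo {
    static String addEscapes(String str) {
        StringBuilder retval = new StringBuilder();
        for (int i = 0; i < str.length(); i++) {
            char ch = str.charAt(i);
            switch (ch) {
                case 0:    continue;                      // drop NULs entirely
                case '\b': retval.append("\\b");  continue;
                case '\t': retval.append("\\t");  continue;
                case '\n': retval.append("\\n");  continue;
                case '\f': retval.append("\\f");  continue;
                case '\r': retval.append("\\r");  continue;
                case '\"': retval.append("\\\""); continue;
                case '\'': retval.append("\\'");  continue;
                case '\\': retval.append("\\\\"); continue;
                default:
                    if (ch < 0x20 || ch > 0x7e) {
                        // Pad the hex form to four digits for a \uXXXX escape.
                        String s = "0000" + Integer.toString(ch, 16);
                        retval.append("\\u").append(s.substring(s.length() - 4));
                    } else {
                        retval.append(ch);
                    }
            }
        }
        return retval.toString();
    }

    public static void main(String[] args) {
        System.out.println(addEscapes("tab\there")); // tab\there
        System.out.println(addEscapes("\u00e9"));    // \u00e9
    }
}
```

Control characters and quotes come out as Java-style escapes; anything outside printable ASCII becomes a `\uXXXX` sequence.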


@ -1,439 +0,0 @@
/* Generated By:JavaCC: Do not edit this line. SimpleCharStream.java Version 4.0 */
package org.apache.hadoop.hbase.hql.generated;
/**
* An implementation of interface CharStream, where the stream is assumed to
* contain only ASCII characters (without unicode processing).
*/
public class SimpleCharStream
{
public static final boolean staticFlag = false;
int bufsize;
int available;
int tokenBegin;
public int bufpos = -1;
protected int bufline[];
protected int bufcolumn[];
protected int column = 0;
protected int line = 1;
protected boolean prevCharIsCR = false;
protected boolean prevCharIsLF = false;
protected java.io.Reader inputStream;
protected char[] buffer;
protected int maxNextCharInd = 0;
protected int inBuf = 0;
protected int tabSize = 8;
protected void setTabSize(int i) { tabSize = i; }
protected int getTabSize(int i) { return tabSize; }
protected void ExpandBuff(boolean wrapAround)
{
char[] newbuffer = new char[bufsize + 2048];
int newbufline[] = new int[bufsize + 2048];
int newbufcolumn[] = new int[bufsize + 2048];
try
{
if (wrapAround)
{
System.arraycopy(buffer, tokenBegin, newbuffer, 0, bufsize - tokenBegin);
System.arraycopy(buffer, 0, newbuffer,
bufsize - tokenBegin, bufpos);
buffer = newbuffer;
System.arraycopy(bufline, tokenBegin, newbufline, 0, bufsize - tokenBegin);
System.arraycopy(bufline, 0, newbufline, bufsize - tokenBegin, bufpos);
bufline = newbufline;
System.arraycopy(bufcolumn, tokenBegin, newbufcolumn, 0, bufsize - tokenBegin);
System.arraycopy(bufcolumn, 0, newbufcolumn, bufsize - tokenBegin, bufpos);
bufcolumn = newbufcolumn;
maxNextCharInd = (bufpos += (bufsize - tokenBegin));
}
else
{
System.arraycopy(buffer, tokenBegin, newbuffer, 0, bufsize - tokenBegin);
buffer = newbuffer;
System.arraycopy(bufline, tokenBegin, newbufline, 0, bufsize - tokenBegin);
bufline = newbufline;
System.arraycopy(bufcolumn, tokenBegin, newbufcolumn, 0, bufsize - tokenBegin);
bufcolumn = newbufcolumn;
maxNextCharInd = (bufpos -= tokenBegin);
}
}
catch (Throwable t)
{
throw new Error(t.getMessage());
}
bufsize += 2048;
available = bufsize;
tokenBegin = 0;
}
protected void FillBuff() throws java.io.IOException
{
if (maxNextCharInd == available)
{
if (available == bufsize)
{
if (tokenBegin > 2048)
{
bufpos = maxNextCharInd = 0;
available = tokenBegin;
}
else if (tokenBegin < 0)
bufpos = maxNextCharInd = 0;
else
ExpandBuff(false);
}
else if (available > tokenBegin)
available = bufsize;
else if ((tokenBegin - available) < 2048)
ExpandBuff(true);
else
available = tokenBegin;
}
int i;
try {
if ((i = inputStream.read(buffer, maxNextCharInd,
available - maxNextCharInd)) == -1)
{
inputStream.close();
throw new java.io.IOException();
}
else
maxNextCharInd += i;
return;
}
catch(java.io.IOException e) {
--bufpos;
backup(0);
if (tokenBegin == -1)
tokenBegin = bufpos;
throw e;
}
}
public char BeginToken() throws java.io.IOException
{
tokenBegin = -1;
char c = readChar();
tokenBegin = bufpos;
return c;
}
protected void UpdateLineColumn(char c)
{
column++;
if (prevCharIsLF)
{
prevCharIsLF = false;
line += (column = 1);
}
else if (prevCharIsCR)
{
prevCharIsCR = false;
if (c == '\n')
{
prevCharIsLF = true;
}
else
line += (column = 1);
}
switch (c)
{
case '\r' :
prevCharIsCR = true;
break;
case '\n' :
prevCharIsLF = true;
break;
case '\t' :
column--;
column += (tabSize - (column % tabSize));
break;
default :
break;
}
bufline[bufpos] = line;
bufcolumn[bufpos] = column;
}
public char readChar() throws java.io.IOException
{
if (inBuf > 0)
{
--inBuf;
if (++bufpos == bufsize)
bufpos = 0;
return buffer[bufpos];
}
if (++bufpos >= maxNextCharInd)
FillBuff();
char c = buffer[bufpos];
UpdateLineColumn(c);
return (c);
}
/**
* @deprecated
* @see #getEndColumn
*/
public int getColumn() {
return bufcolumn[bufpos];
}
/**
* @deprecated
* @see #getEndLine
*/
public int getLine() {
return bufline[bufpos];
}
public int getEndColumn() {
return bufcolumn[bufpos];
}
public int getEndLine() {
return bufline[bufpos];
}
public int getBeginColumn() {
return bufcolumn[tokenBegin];
}
public int getBeginLine() {
return bufline[tokenBegin];
}
public void backup(int amount) {
inBuf += amount;
if ((bufpos -= amount) < 0)
bufpos += bufsize;
}
public SimpleCharStream(java.io.Reader dstream, int startline,
int startcolumn, int buffersize)
{
inputStream = dstream;
line = startline;
column = startcolumn - 1;
available = bufsize = buffersize;
buffer = new char[buffersize];
bufline = new int[buffersize];
bufcolumn = new int[buffersize];
}
public SimpleCharStream(java.io.Reader dstream, int startline,
int startcolumn)
{
this(dstream, startline, startcolumn, 4096);
}
public SimpleCharStream(java.io.Reader dstream)
{
this(dstream, 1, 1, 4096);
}
public void ReInit(java.io.Reader dstream, int startline,
int startcolumn, int buffersize)
{
inputStream = dstream;
line = startline;
column = startcolumn - 1;
if (buffer == null || buffersize != buffer.length)
{
available = bufsize = buffersize;
buffer = new char[buffersize];
bufline = new int[buffersize];
bufcolumn = new int[buffersize];
}
prevCharIsLF = prevCharIsCR = false;
tokenBegin = inBuf = maxNextCharInd = 0;
bufpos = -1;
}
public void ReInit(java.io.Reader dstream, int startline,
int startcolumn)
{
ReInit(dstream, startline, startcolumn, 4096);
}
public void ReInit(java.io.Reader dstream)
{
ReInit(dstream, 1, 1, 4096);
}
public SimpleCharStream(java.io.InputStream dstream, String encoding, int startline,
int startcolumn, int buffersize) throws java.io.UnsupportedEncodingException
{
this(encoding == null ? new java.io.InputStreamReader(dstream) : new java.io.InputStreamReader(dstream, encoding), startline, startcolumn, buffersize);
}
public SimpleCharStream(java.io.InputStream dstream, int startline,
int startcolumn, int buffersize)
{
this(new java.io.InputStreamReader(dstream), startline, startcolumn, buffersize);
}
public SimpleCharStream(java.io.InputStream dstream, String encoding, int startline,
int startcolumn) throws java.io.UnsupportedEncodingException
{
this(dstream, encoding, startline, startcolumn, 4096);
}
public SimpleCharStream(java.io.InputStream dstream, int startline,
int startcolumn)
{
this(dstream, startline, startcolumn, 4096);
}
public SimpleCharStream(java.io.InputStream dstream, String encoding) throws java.io.UnsupportedEncodingException
{
this(dstream, encoding, 1, 1, 4096);
}
public SimpleCharStream(java.io.InputStream dstream)
{
this(dstream, 1, 1, 4096);
}
public void ReInit(java.io.InputStream dstream, String encoding, int startline,
int startcolumn, int buffersize) throws java.io.UnsupportedEncodingException
{
ReInit(encoding == null ? new java.io.InputStreamReader(dstream) : new java.io.InputStreamReader(dstream, encoding), startline, startcolumn, buffersize);
}
public void ReInit(java.io.InputStream dstream, int startline,
int startcolumn, int buffersize)
{
ReInit(new java.io.InputStreamReader(dstream), startline, startcolumn, buffersize);
}
public void ReInit(java.io.InputStream dstream, String encoding) throws java.io.UnsupportedEncodingException
{
ReInit(dstream, encoding, 1, 1, 4096);
}
public void ReInit(java.io.InputStream dstream)
{
ReInit(dstream, 1, 1, 4096);
}
public void ReInit(java.io.InputStream dstream, String encoding, int startline,
int startcolumn) throws java.io.UnsupportedEncodingException
{
ReInit(dstream, encoding, startline, startcolumn, 4096);
}
public void ReInit(java.io.InputStream dstream, int startline,
int startcolumn)
{
ReInit(dstream, startline, startcolumn, 4096);
}
public String GetImage()
{
if (bufpos >= tokenBegin)
return new String(buffer, tokenBegin, bufpos - tokenBegin + 1);
else
return new String(buffer, tokenBegin, bufsize - tokenBegin) +
new String(buffer, 0, bufpos + 1);
}
public char[] GetSuffix(int len)
{
char[] ret = new char[len];
if ((bufpos + 1) >= len)
System.arraycopy(buffer, bufpos - len + 1, ret, 0, len);
else
{
System.arraycopy(buffer, bufsize - (len - bufpos - 1), ret, 0,
len - bufpos - 1);
System.arraycopy(buffer, 0, ret, len - bufpos - 1, bufpos + 1);
}
return ret;
}
public void Done()
{
buffer = null;
bufline = null;
bufcolumn = null;
}
/**
* Method to adjust line and column numbers for the start of a token.
*/
public void adjustBeginLineColumn(int newLine, int newCol)
{
int start = tokenBegin;
int len;
if (bufpos >= tokenBegin)
{
len = bufpos - tokenBegin + inBuf + 1;
}
else
{
len = bufsize - tokenBegin + bufpos + 1 + inBuf;
}
int i = 0, j = 0, k = 0;
int nextColDiff = 0, columnDiff = 0;
while (i < len &&
bufline[j = start % bufsize] == bufline[k = ++start % bufsize])
{
bufline[j] = newLine;
nextColDiff = columnDiff + bufcolumn[k] - bufcolumn[j];
bufcolumn[j] = newCol + columnDiff;
columnDiff = nextColDiff;
i++;
}
if (i < len)
{
bufline[j] = newLine++;
bufcolumn[j] = newCol + columnDiff;
while (i++ < len)
{
if (bufline[j = start % bufsize] != bufline[++start % bufsize])
bufline[j] = newLine++;
else
bufline[j] = newLine;
}
}
line = bufline[j];
column = bufcolumn[j];
}
}


@ -1,81 +0,0 @@
/* Generated By:JavaCC: Do not edit this line. Token.java Version 3.0 */
package org.apache.hadoop.hbase.hql.generated;
/**
* Describes the input token stream.
*/
public class Token {
/**
* An integer that describes the kind of this token. This numbering
* system is determined by JavaCCParser, and a table of these numbers is
* stored in the file ...Constants.java.
*/
public int kind;
/**
* beginLine and beginColumn describe the position of the first character
* of this token; endLine and endColumn describe the position of the
* last character of this token.
*/
public int beginLine, beginColumn, endLine, endColumn;
/**
* The string image of the token.
*/
public String image;
/**
* A reference to the next regular (non-special) token from the input
* stream. If this is the last token from the input stream, or if the
* token manager has not read tokens beyond this one, this field is
* set to null. This is true only if this token is also a regular
* token. Otherwise, see below for a description of the contents of
* this field.
*/
public Token next;
/**
* This field is used to access special tokens that occur prior to this
* token, but after the immediately preceding regular (non-special) token.
* If there are no such special tokens, this field is set to null.
* When there are more than one such special token, this field refers
* to the last of these special tokens, which in turn refers to the next
* previous special token through its specialToken field, and so on
* until the first special token (whose specialToken field is null).
* The next fields of special tokens refer to other special tokens that
* immediately follow it (without an intervening regular token). If there
* is no such token, this field is null.
*/
public Token specialToken;
/**
* Returns the image.
*/
public String toString()
{
return image;
}
/**
* Returns a new Token object, by default. However, if you want, you
* can create and return subclass objects based on the value of ofKind.
* Simply add the cases to the switch for all those special cases.
* For example, if you have a subclass of Token called IDToken that
   * you want to create if ofKind is ID, simply add something like:
*
* case MyParserConstants.ID : return new IDToken();
*
* to the following switch statement. Then you can cast matchedToken
* variable to the appropriate type and use it in your lexical actions.
*/
public static final Token newToken(int ofKind)
{
switch(ofKind)
{
default : return new Token();
}
}
}
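The `newToken` javadoc above describes returning `Token` subclasses for particular token kinds. A hypothetical self-contained sketch of that hook follows — `TokenDemo`, `IDToken`, and the `ID` ordinal are illustrative, not values from the generated `...Constants` interface:

```java
// Hypothetical sketch of the newToken subclassing hook described above;
// the ID ordinal and IDToken class are illustrative, not generated code.
class TokenDemo {
    static class Token {
        int kind;
        String image;
        public String toString() { return image; }
    }

    // In a real generated parser this ordinal would come from ...Constants.
    static final int ID = 42;

    static class IDToken extends Token { }

    static Token newToken(int ofKind) {
        switch (ofKind) {
            case ID: // special-cased kinds get their own subclass
                IDToken t = new IDToken();
                t.kind = ofKind;
                return t;
            default:
                Token d = new Token();
                d.kind = ofKind;
                return d;
        }
    }

    public static void main(String[] args) {
        System.out.println(newToken(ID) instanceof IDToken); // true
        System.out.println(newToken(0) instanceof IDToken);  // false
    }
}
```

The parser only ever calls the factory, so lexical actions can downcast the matched token to the subclass for the kinds they special-case.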


@ -1,133 +0,0 @@
/* Generated By:JavaCC: Do not edit this line. TokenMgrError.java Version 3.0 */
package org.apache.hadoop.hbase.hql.generated;
public class TokenMgrError extends Error
{
/*
* Ordinals for various reasons why an Error of this type can be thrown.
*/
/**
    * Lexical error occurred.
*/
static final int LEXICAL_ERROR = 0;
/**
    * An attempt was made to create a second instance of a static token manager.
*/
static final int STATIC_LEXER_ERROR = 1;
/**
* Tried to change to an invalid lexical state.
*/
static final int INVALID_LEXICAL_STATE = 2;
/**
* Detected (and bailed out of) an infinite loop in the token manager.
*/
static final int LOOP_DETECTED = 3;
/**
* Indicates the reason why the exception is thrown. It will have
* one of the above 4 values.
*/
int errorCode;
/**
    * Replaces unprintable characters by their escaped (or unicode escaped)
* equivalents in the given string
*/
protected static final String addEscapes(String str) {
StringBuffer retval = new StringBuffer();
char ch;
for (int i = 0; i < str.length(); i++) {
switch (str.charAt(i))
{
case 0 :
continue;
case '\b':
retval.append("\\b");
continue;
case '\t':
retval.append("\\t");
continue;
case '\n':
retval.append("\\n");
continue;
case '\f':
retval.append("\\f");
continue;
case '\r':
retval.append("\\r");
continue;
case '\"':
retval.append("\\\"");
continue;
case '\'':
retval.append("\\\'");
continue;
case '\\':
retval.append("\\\\");
continue;
default:
if ((ch = str.charAt(i)) < 0x20 || ch > 0x7e) {
String s = "0000" + Integer.toString(ch, 16);
retval.append("\\u" + s.substring(s.length() - 4, s.length()));
} else {
retval.append(ch);
}
continue;
}
}
return retval.toString();
}
/**
* Returns a detailed message for the Error when it is thrown by the
* token manager to indicate a lexical error.
* Parameters :
    *    EOFSeen     : indicates if EOF caused the lexical error
    *    curLexState : lexical state in which this error occurred
    *    errorLine   : line number when the error occurred
    *    errorColumn : column number when the error occurred
    *    errorAfter  : prefix that was seen before this error occurred
* curchar : the offending character
* Note: You can customize the lexical error message by modifying this method.
*/
protected static String LexicalError(boolean EOFSeen, int lexState, int errorLine, int errorColumn, String errorAfter, char curChar) {
return("Lexical error at line " +
errorLine + ", column " +
errorColumn + ". Encountered: " +
(EOFSeen ? "<EOF> " : ("\"" + addEscapes(String.valueOf(curChar)) + "\"") + " (" + (int)curChar + "), ") +
"after : \"" + addEscapes(errorAfter) + "\"");
}
/**
* You can also modify the body of this method to customize your error messages.
* For example, cases like LOOP_DETECTED and INVALID_LEXICAL_STATE are not
* of end-users concern, so you can return something like :
*
* "Internal Error : Please file a bug report .... "
*
* from this method for such cases in the release version of your parser.
*/
public String getMessage() {
return super.getMessage();
}
/*
* Constructors of various flavors follow.
*/
public TokenMgrError() {
}
public TokenMgrError(String message, int reason) {
super(message);
errorCode = reason;
}
public TokenMgrError(boolean EOFSeen, int lexState, int errorLine, int errorColumn, String errorAfter, char curChar, int reason) {
this(LexicalError(EOFSeen, lexState, errorLine, errorColumn, errorAfter, curChar), reason);
}
}


@ -133,13 +133,9 @@ ${HBASE_HOME}/bin/start-hbase.sh
</pre>
<p>
Once HBase has started, enter <code>${HBASE_HOME}/bin/hbase shell</code> to obtain a
shell against HBase from which you can execute HQL commands (HQL is a severe subset of SQL).
In the HBase shell, type <code>help;</code> to see a list of supported HQL commands. Note
that all commands in the HBase
shell must end with <code>;</code>. Test your installation by creating, viewing, and dropping
a table, as per the help instructions. Be patient with the <code>create</code> and
<code>drop</code> operations as they may each take 10 seconds or more. To stop HBase, exit the
HBase shell and enter:
shell against HBase from which you can execute commands.
Test your installation by creating, viewing, and dropping a table.
To stop HBase, exit the HBase shell and enter:
</p>
<pre>
${HBASE_HOME}/bin/stop-hbase.sh


@ -43,21 +43,4 @@ log4j.appender.console.layout.ConversionPattern=%d %-5p [%t] %C{2}(%L): %m%n
#log4j.logger.org.apache.hadoop.fs.FSNamesystem=DEBUG
log4j.logger.org.apache.hadoop=WARN
log4j.logger.org.apache.hadoop.hbase.PerformanceEvaluation=WARN
log4j.logger.org.apache.hadoop.hbase.client=DEBUG
log4j.logger.org.apache.hadoop.hbase.filter=INFO
log4j.logger.org.apache.hadoop.hbase.generated=INFO
log4j.logger.org.apache.hadoop.hbase.hql=INFO
log4j.logger.org.apache.hadoop.hbase.io=INFO
log4j.logger.org.apache.hadoop.hbase.ipc=INFO
log4j.logger.org.apache.hadoop.hbase.mapred=INFO
log4j.logger.org.apache.hadoop.hbase.master=DEBUG
log4j.logger.org.apache.hadoop.hbase.regionserver=DEBUG
log4j.logger.org.apache.hadoop.hbase.rest=INFO
log4j.logger.org.apache.hadoop.hbase.thrift=INFO
log4j.logger.org.apache.hadoop.hbase.util=INFO
log4j.logger.org.apache.hadoop.mapred=DEBUG
#log4j.logger.org.apache.hadoop.mapred.JobTracker=DEBUG
#log4j.logger.org.apache.hadoop.mapred.TaskTracker=DEBUG
log4j.logger.org.apache.hadoop.hbase=WARN


@ -9,11 +9,6 @@ Automatically created by Tomcat JspC.
<web-app>
<servlet>
<servlet-name>org.apache.hadoop.hbase.generated.master.hql_jsp</servlet-name>
<servlet-class>org.apache.hadoop.hbase.generated.master.hql_jsp</servlet-class>
</servlet>
<servlet>
<servlet-name>org.apache.hadoop.hbase.generated.master.master_jsp</servlet-name>
<servlet-class>org.apache.hadoop.hbase.generated.master.master_jsp</servlet-class>
@ -24,11 +19,6 @@ Automatically created by Tomcat JspC.
<servlet-class>org.apache.hadoop.hbase.generated.master.table_jsp</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>org.apache.hadoop.hbase.generated.master.hql_jsp</servlet-name>
<url-pattern>/hql.jsp</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>org.apache.hadoop.hbase.generated.master.master_jsp</servlet-name>
<url-pattern>/master.jsp</url-pattern>


@ -1,59 +0,0 @@
<%@ page contentType="text/html;charset=UTF-8"
import="java.util.*"
import="org.apache.hadoop.hbase.HBaseConfiguration"
import="org.apache.hadoop.hbase.hql.TableFormatter"
import="org.apache.hadoop.hbase.hql.ReturnMsg"
import="org.apache.hadoop.hbase.hql.generated.HQLParser"
import="org.apache.hadoop.hbase.hql.Command"
import="org.apache.hadoop.hbase.hql.formatter.HtmlTableFormatter"
%><?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head><meta http-equiv="Content-Type" content="text/html;charset=UTF-8"/>
<title>HQL</title>
<link rel="stylesheet" type="text/css" href="/static/hbase.css" />
</head>
<body>
<a id="logo" href="http://wiki.apache.org/lucene-hadoop/Hbase"><img src="/static/hbase_logo_med.gif" alt="Hbase Logo" title="Hbase Logo" /></a>
<h1 id="page_title"><a href="http://wiki.apache.org/lucene-hadoop/Hbase/HbaseShell">HQL</a></h1>
<p id="links_menu"><a href="/master.jsp">Home</a></p>
<hr id="head_rule" />
<% String query = request.getParameter("q");
if (query == null) {
query = "";
}
%>
<form action="/hql.jsp" method="get">
<p>
<label for="query">Query: </label>
<input type="text" name="q" id="q" size="60" value="<%= query %>" />
<input type="submit" value="submit" />
</p>
</form>
<p>Enter 'help;' -- that's 'help' plus a semi-colon -- for the list of <em>HQL</em> commands.
Data Definition, SHELL, INSERTS, DELETES, and UPDATE commands are disabled in this interface
</p>
<%
if (query.length() > 0) {
%>
<hr/>
<%
HQLParser parser = new HQLParser(query, out, new HtmlTableFormatter(out));
Command cmd = parser.terminatedCommand();
if (cmd.getCommandType() != Command.CommandType.SELECT) {
%>
<p><%= cmd.getCommandType() %>-type commands are disabled in this interface.</p>
<%
} else {
ReturnMsg rm = cmd.execute(new HBaseConfiguration());
String summary = rm == null? "": rm.toString();
%>
<p><%= summary %></p>
<% }
}
%>
</body>
</html>


@ -9,10 +9,6 @@
import="org.apache.hadoop.hbase.HServerInfo"
import="org.apache.hadoop.hbase.HServerAddress"
import="org.apache.hadoop.hbase.HBaseConfiguration"
import="org.apache.hadoop.hbase.hql.ShowCommand"
import="org.apache.hadoop.hbase.hql.TableFormatter"
import="org.apache.hadoop.hbase.hql.ReturnMsg"
import="org.apache.hadoop.hbase.hql.formatter.HtmlTableFormatter"
import="org.apache.hadoop.hbase.HTableDescriptor" %><%
HMaster master = (HMaster)getServletContext().getAttribute(HMaster.MASTER);
HBaseConfiguration conf = master.getConfiguration();
@ -34,7 +30,7 @@
<a id="logo" href="http://wiki.apache.org/lucene-hadoop/Hbase"><img src="/static/hbase_logo_med.gif" alt="HBase Logo" title="HBase Logo" /></a>
<h1 id="page_title">Master: <%=master.getMasterAddress().getHostname()%>:<%=master.getMasterAddress().getPort()%></h1>
<p id="links_menu"><a href="/hql.jsp">HQL</a>, <a href="/logs/">Local logs</a>, <a href="/stacks">Thread Dump</a>, <a href="/logLevel">Log Level</a></p>
<p id="links_menu"><a href="/logs/">Local logs</a>, <a href="/stacks">Thread Dump</a>, <a href="/logLevel">Log Level</a></p>
<hr id="head_rule" />
<h2>Master Attributes</h2>