Questions tagged [phoenix]

Use this tag for questions about Apache Phoenix, an open-source project in the Hadoop ecosystem that provides a SQL layer on top of HBase (a non-relational database). For questions about the Phoenix Framework, an Elixir web framework for building HTML5 apps, APIs, and distributed applications, use the tag [phoenix-framework].

Apache Phoenix is a SQL skin over HBase delivered as a client-embedded JDBC driver targeting low latency queries over HBase data. Apache Phoenix takes your SQL query, compiles it into a series of HBase scans, and orchestrates the running of those scans to produce regular JDBC result sets. The table metadata is stored in an HBase table and versioned, such that snapshot queries over prior versions will automatically use the correct schema. Direct use of the HBase API, along with coprocessors and custom filters, results in performance on the order of milliseconds for small queries, or seconds for tens of millions of rows.
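
To make the "client-embedded JDBC driver" point concrete, here is a minimal sketch of connecting to Phoenix and running a query; the ZooKeeper quorum ("localhost:2181") and the table name MY_TABLE are placeholder assumptions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixJdbcExample {
    public static void main(String[] args) throws Exception {
        // The "fat" client talks directly to HBase via the ZooKeeper quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
             Statement stmt = conn.createStatement();
             // MY_TABLE is a hypothetical Phoenix table.
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM MY_TABLE")) {
            while (rs.next()) {
                System.out.println("rows = " + rs.getLong(1));
            }
        }
    }
}
```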

845 questions
20 votes • 1 answer

Storing data in HBase vs Parquet files

I am new to big data and am trying to understand the various ways of persisting and retrieving data. I understand that both Parquet and HBase are column-oriented storage formats, but Parquet is file-based storage rather than a database, unlike HBase. My…
sovan • 311
18 votes • 3 answers

Cannot read large data from a Phoenix table

Hi all, I am getting the error message below while running a Phoenix count query on a large table. 0: jdbc:phoenix:hadoopm1:2181> select Count(*) from PJM_DATASET; +------------+ | COUNT(1) | +------------+ java.lang.RuntimeException:… (see the timeout sketch after this entry)
user3683741 • 181
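
Counts over very large tables often fail on the client-side query timeout rather than on the scan itself. Below is a hedged sketch of raising the relevant client settings through connection properties; the property values are illustrative, and the same keys can also be set in the client's hbase-site.xml.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class LongRunningCount {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Phoenix client-side query timeout in milliseconds (illustrative value).
        props.setProperty("phoenix.query.timeoutMs", "600000");
        // The underlying HBase RPC and scanner timeouts need headroom too.
        props.setProperty("hbase.rpc.timeout", "600000");
        props.setProperty("hbase.client.scanner.timeout.period", "600000");

        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:hadoopm1:2181", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM PJM_DATASET")) {
            if (rs.next()) {
                System.out.println("row count = " + rs.getLong(1));
            }
        }
    }
}
```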
14 votes • 2 answers

Using Phoenix with Cloudera HBase (installed from repo)

I can get Phoenix working on standalone Apache HBase (note: all of this is for HBase 1.0.0 on RHEL 6.5). For the Cloudera flavour of HBase, however, I never get it working without it throwing exceptions (I even tried RHEL 7 minimal as an OS). The same thing…
Havnar • 2,331
14 votes • 2 answers

HBase scans are slow

Problem: I am trying to build a secondary index with Phoenix. Index creation takes several hours. It seems to be due to slow HBase scans, as I noticed the following performance: I might need 2 hours to scan the table, whereas other developers… (see the ASYNC index sketch after this entry)
Martin • 2,170
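
One commonly suggested workaround when a plain CREATE INDEX is too slow is to declare the index ASYNC and build it with the IndexTool MapReduce job instead of a single client-driven scan. A hedged sketch; the table, column, and index names are made up:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class AsyncIndexExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
             Statement stmt = conn.createStatement()) {
            // ASYNC makes the DDL return immediately; the index is populated later by
            // the MapReduce IndexTool, e.g.:
            //   hbase org.apache.phoenix.mapreduce.index.IndexTool \
            //     --data-table MY_TABLE --index-table MY_INDEX --output-path /tmp/my_index_hfiles
            stmt.execute("CREATE INDEX MY_INDEX ON MY_TABLE (COL1) ASYNC");
        }
    }
}
```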
13 votes • 2 answers

JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')

I have a program that inserts a new patient into HBase in a Docker container on a server. Everything works fine until I try to change the connection IP to a Phoenix Query Server for running JUnit tests. I am setting the URL in the properties… (see the thin-client sketch after this entry)
randombee • 658
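
An unexpected '<' in a JSON parse error usually means the client received an HTML page (for example an error page, or the wrong port) instead of the Query Server's wire format. A hedged sketch of a Phoenix Query Server (thin client) connection; host, port, and serialization are assumptions to adapt:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ThinClientExample {
    public static void main(String[] args) throws Exception {
        // The thin client goes through the Phoenix Query Server (default port 8765).
        // The serialization setting must match the server's (PROTOBUF is the usual default).
        String url = "jdbc:phoenix:thin:url=http://queryserver-host:8765;serialization=PROTOBUF";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // Simple connectivity check against the Phoenix catalog.
             ResultSet rs = stmt.executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```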
13 votes • 3 answers

How to export data to a text file in Apache Phoenix?

I'm quite new to HBase and Phoenix, but is there a way I can dump/export data to a text file? It would be much appreciated if I could specify the field terminator, such as ',', '|>', etc. Thanks. (See the JDBC export sketch after this entry.)
dehiker • 424
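
Phoenix has no SELECT ... INTO OUTFILE, so one plain workaround is to stream the query through JDBC and write the delimiter yourself (sqlline's !outputformat csv together with !record is another option). A hedged sketch; the table, output path, and '|' terminator are placeholders:

```java
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class ExportToTextFile {
    public static void main(String[] args) throws Exception {
        String fieldTerminator = "|";  // any delimiter, e.g. "," or "|>"
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM MY_TABLE");  // hypothetical table
             PrintWriter out = new PrintWriter(Files.newBufferedWriter(Paths.get("/tmp/my_table.txt")))) {
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) line.append(fieldTerminator);
                    line.append(rs.getString(i));  // nulls print as "null"; adjust as needed
                }
                out.println(line);
            }
        }
    }
}
```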
10 votes • 1 answer

Write Dataframe to Phoenix

I am trying to write a DataFrame to a Phoenix table but I am getting an exception. Here is my code (a Java sketch of the same write path follows this entry): df.write.format("org.apache.phoenix.spark").mode(SaveMode.Overwrite).options(collection.immutable.Map( "zkUrl" ->…
ROOT • 1,701
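
For comparison, a hedged Java sketch of the same write path through the phoenix-spark connector; the input source, table name, and zkUrl are assumptions, and SaveMode.Overwrite is used because the connector performs upserts rather than a true overwrite.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class WriteDataFrameToPhoenix {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("phoenix-write").getOrCreate();

        // Hypothetical source; column names must match the Phoenix table's columns.
        Dataset<Row> df = spark.read().parquet("/tmp/input.parquet");

        df.write()
          .format("org.apache.phoenix.spark")
          .mode(SaveMode.Overwrite)            // the connector upserts rows
          .option("table", "MY_TABLE")         // hypothetical Phoenix table
          .option("zkUrl", "zk-host:2181")     // ZooKeeper quorum
          .save();
    }
}
```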
9 votes • 4 answers

How to export table schemas in Apache Phoenix?

I'd like to export the schema of an existing table in Apache Phoenix. Are there commands or tools that do the same thing as show create table TABLE_NAME in MySQL? Thanks. (See the JDBC metadata sketch after this entry.)
Weibo Li • 3,277
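
Phoenix has no SHOW CREATE TABLE equivalent, but the schema can be reconstructed from standard JDBC metadata (or read from SYSTEM.CATALOG). A hedged sketch that lists column names and types for a hypothetical table:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class DumpTableSchema {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")) {
            DatabaseMetaData meta = conn.getMetaData();
            // A null schema pattern matches the default schema; "MY_TABLE" is hypothetical.
            try (ResultSet cols = meta.getColumns(null, null, "MY_TABLE", null)) {
                while (cols.next()) {
                    System.out.printf("%s %s%n",
                            cols.getString("COLUMN_NAME"),
                            cols.getString("TYPE_NAME"));
                }
            }
        }
    }
}
```

The full DDL can then be reassembled by hand from this listing plus the primary key and table options.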
8 votes • 1 answer

java.lang.IllegalArgumentException: Unable to PTableType enum for value of 'MATERIALIZED VIEW' Exception with Phoenix and Hbase

I am very new to Saiku. I am trying to integrate Saiku with Phoenix; Phoenix in turn connects to HBase. I created a schema, and when Saiku tries to load the Phoenix schema XML, I get the error below. I am working restlessly to figure…
venky • 370
8 votes • 3 answers

Phoenix sqlline cannot display all columns of table on terminal

I use Phoenix sqlline to connect to HBase. In a SecureCRT terminal I can only see three columns of a table that has more than 10 columns. I would like to display all columns of the table to check whether the data is OK. Is there any configuration that should be…
Carl H • 353
8 votes • 4 answers

Big data with very fast access

I am facing a problem: a database for process plants. There are up to 50,000 sensors at a sampling rate of 50 ms. All measured values need to be stored for at least 3 years and must support real-time queries (i.e. users can see historical data with delay… (see the sizing sketch after this entry)
duong_dajgja • 3,878
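
For a sense of scale, the stated numbers work out to roughly a million samples per second and tens of trillions of samples over three years; a back-of-the-envelope calculation (the 16 bytes per sample is an assumption):

```java
public class IngestEstimate {
    public static void main(String[] args) {
        long sensors = 50_000;
        double samplePeriodSec = 0.05;                          // 50 ms sampling rate
        double samplesPerSec = sensors / samplePeriodSec;       // 1,000,000 samples/s

        double secondsIn3Years = 3 * 365.25 * 24 * 3600;        // ~9.47e7 s
        double totalSamples = samplesPerSec * secondsIn3Years;  // ~9.5e13 samples

        long bytesPerSample = 16;                               // assumption: key + timestamp + value
        double totalBytes = totalSamples * bytesPerSample;      // ~1.5 PB before compression

        System.out.printf("%.0f samples/s, %.2e samples, %.2f PB%n",
                samplesPerSec, totalSamples, totalBytes / 1e15);
    }
}
```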
8 votes • 1 answer

Google Cloud Bigtable coprocessor support

Google Cloud BigTable doesn't support coprocessors: Coprocessors are not supported. You cannot create classes that implement the interface org.apache.hadoop.hbase.coprocessor. https://cloud.google.com/bigtable/docs/hbase-differences I can…
Sergei Rodionov • 3,078
7 votes • 2 answers

Tuple {:option, :server_only, :honor_cipher_order} being returned for error reason from OAuth2 package

The Phoenix application I'm supporting has OAuth authentication using two different authentication servers. Mysteriously, they have begun exhibiting unexpected behaviour only in my development environment. The code uses the OAuth2 hex package for…
Keith Pitty • 1,068
7 votes • 1 answer

HBase cluster - Can't connect to HBase via Phoenix client

I am trying to connect to an HBase cluster via Phoenix. First, I copied the Phoenix client and query server jar files to the HMaster and HRegion lib folders and restarted the HBase services. Server: started the Phoenix Query Server via /bin/queryserver.py. Its…
BASS KARAN • 181
7 votes • 2 answers

Apache Phoenix vs Hive-Spark

Which is faster/easier for running SQL scripts as input: Spark SQL, which acts as a speed layer over Hive's high-latency queries, or Phoenix? And if so, how? I need to do a lot of upserts/joins/grouping over the data. [hbase] Is… (see the UPSERT SELECT sketch after this entry)
kraster • 287
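
As a flavour of the upsert/aggregation side mentioned in the question, a hedged sketch of a Phoenix UPSERT SELECT issued over JDBC; all table and column names are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class UpsertAggregateExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
             Statement stmt = conn.createStatement()) {
            // Phoenix runs the aggregation on the region servers and upserts the result.
            int rows = stmt.executeUpdate(
                "UPSERT INTO DAILY_TOTALS (EVENT_DAY, TOTAL_AMOUNT) " +
                "SELECT TRUNC(EVENT_TIME, 'DAY'), SUM(AMOUNT) " +
                "FROM EVENTS GROUP BY TRUNC(EVENT_TIME, 'DAY')");
            conn.commit();  // Phoenix connections do not auto-commit by default
            System.out.println("upserted " + rows + " rows");
        }
    }
}
```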