
Reuse the result of a select expression in the "GROUP BY" clause?
This article explains how to reuse the result of a select expression in the "GROUP BY" clause. It should be a useful reference if you run into the same problem.

Problem description

In MySQL, I can have a query like this:

                select  
                    cast(from_unixtime(t.time, '%Y-%m-%d %H:00') as datetime) as timeHour
                    , ... 
                from
                    some_table t 
                group by
                    timeHour, ...
                order by
                    timeHour, ...
                

where timeHour in the GROUP BY is the result of a select expression.
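This relies on a documented MySQL extension to standard SQL: GROUP BY and ORDER BY may refer to aliases defined in the select list, so the expression does not have to be repeated.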

But I just tried a similar query in Spark SQL, and I got this error:

                Error: org.apache.spark.sql.AnalysisException: 
                cannot resolve '`timeHour`' given input columns: ...
                

My query for Spark SQL was this:

                select  
                      cast(t.unixTime as timestamp) as timeHour
                    , ...
                from
                    another_table as t
                group by
                    timeHour, ...
                order by
                    timeHour, ...
                
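The likely cause: in Spark versions that do not resolve aliases in GROUP BY, grouping expressions are checked against the input columns of the FROM clause before the select-list aliases exist, so timeHour is not yet visible at that point.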

Is this construct possible in Spark SQL?

Recommended answer

Is this construct possible in Spark SQL?

Yes, it is. You can make it work in Spark SQL in two ways, so that the new column can be used in the GROUP BY and ORDER BY clauses.

Approach 1, using a subquery:

SELECT
    timeHour, sum(...) AS someThing
FROM (
    SELECT
        from_unixtime(starttime / 1000) AS timeHour,
        starttime,
        ...
    FROM some_table
) t
WHERE
    starttime >= 1000 * unix_timestamp('2017-09-16 00:00:00')
    AND starttime <= 1000 * unix_timestamp('2017-09-16 04:00:00')
GROUP BY
    timeHour
ORDER BY
    timeHour
LIMIT 10;
                
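The subquery works because the alias timeHour is materialized as an ordinary column of the derived table, so the outer GROUP BY and ORDER BY resolve it like any other input column. Note that the aggregation itself has to live in the outer query, next to its GROUP BY.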

Approach 2, using WITH (the elegant way):

-- create an alias (CTE) for the derived table
WITH table_aliase AS (
    SELECT
        from_unixtime(starttime / 1000) AS timeHour,
        starttime,
        ...
    FROM some_table
)

-- use the alias as if it were a table
SELECT
    timeHour, sum(...) AS someThing
FROM table_aliase
WHERE
    starttime >= 1000 * unix_timestamp('2017-09-16 00:00:00')
    AND starttime <= 1000 * unix_timestamp('2017-09-16 04:00:00')
GROUP BY
    timeHour
ORDER BY
    timeHour
LIMIT 10;
                
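For completeness, here is a self-contained sketch of running the CTE version from Scala. It is only illustrative: the local SparkSession, the sample rows, and the value column (standing in for the elided sum(...) columns) are all assumptions, not part of the original answer.

import org.apache.spark.sql.SparkSession

object GroupByAliasExample extends App {
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("group-by-alias")
    .getOrCreate()
  import spark.implicits._

  // Hypothetical sample rows: starttime in epoch milliseconds plus a value
  // to aggregate; the epochs correspond to 2017-09-16 00:00/01:00/02:00 UTC
  // (unix_timestamp() below uses the session time zone, assumed UTC here).
  Seq((1505520000000L, 1.0), (1505523600000L, 2.0), (1505527200000L, 3.0))
    .toDF("starttime", "value")
    .createOrReplaceTempView("some_table")

  // Same shape as the WITH query above, with sum(value) standing in for sum(...)
  spark.sql(
    """WITH table_aliase AS (
      |  SELECT from_unixtime(starttime / 1000) AS timeHour, starttime, value
      |  FROM some_table
      |)
      |SELECT timeHour, sum(value) AS someThing
      |FROM table_aliase
      |WHERE starttime >= 1000 * unix_timestamp('2017-09-16 00:00:00')
      |  AND starttime <= 1000 * unix_timestamp('2017-09-16 04:00:00')
      |GROUP BY timeHour
      |ORDER BY timeHour""".stripMargin
  ).show()

  spark.stop()
}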

An alternative using the Spark DataFrame API (without SQL) in Scala:

// The $-column syntax additionally needs the implicits of your
// SparkSession instance, e.g. import spark.implicits._
import org.apache.spark.sql.functions._

val df = ... // load the actual table as df

// Option 1: materialize timeHour as a column, then group, aggregate and sort on it
df.withColumn("timeHour", from_unixtime($"starttime" / 1000))
  .groupBy($"timeHour")
  .agg(sum("...").as("someThing"))
  .orderBy($"timeHour")
  .show()

// Option 2 (as per eliasah's comment): alias the expression directly in groupBy
df.groupBy(from_unixtime($"starttime" / 1000).as("timeHour"))
  .agg(sum("...").as("someThing"))
  .orderBy($"timeHour")
  .show()
                
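Both DataFrame variants produce the same result. The second skips the intermediate withColumn, and orderBy($"timeHour") still resolves because the grouping expression was aliased to timeHour in the aggregated output.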

That concludes this article on reusing the result of a select expression in the "GROUP BY" clause; hopefully the recommended answer helps.

