To help those skilled in the art better understand the technical solutions in this specification, the following clearly and completely describes the technical solutions in the embodiments of this specification with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of this specification. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this specification without creative effort shall fall within the protection scope of this specification.
With the development of computer network technology, people can learn geographic information about places around the world through electronic maps. Maps and electronic maps are usually divided into regions by province or country, but in some cases certain designated regions may need to be merged on the map — for example, merging the three northeastern provinces on a map of China into a single Northeast region, so that users can view and understand that region as a whole.
The data processing method for merging map regions provided by the embodiments of this specification sets a uniform transparency so that, when map regions are merged, the transparency of the overlapping parts changes; the designated regions are then merged based on these transparency changes in the map image. Merging designated map regions by detecting pixel transparency changes is simple and fast, requires no complex mathematical calculation, and the merged region can be interacted with as a whole, giving the method broad applicability.
In the embodiments of the present invention, the data processing for merging map regions may be performed on a client device, such as a smartphone, a tablet computer, or a smart wearable device (smart watch, virtual reality glasses, virtual reality headset, etc.). Specifically, it may be performed in a browser on the client, such as a PC browser, a mobile browser, or a server-side web container.
Specifically, FIG. 1 is a schematic flowchart of the data processing method for merging map regions in an embodiment provided by this specification. As shown in FIG. 1, the method includes:
S2: Draw map images of the regions to be merged using lines of a preset transparency, generating an initial merged map image.
The regions to be merged may include multiple map regions; for example, Liaoning, Jilin, and Heilongjiang — the three northeastern provinces — may represent three regions to be merged. The map images of the regions to be merged can be drawn with lines of the preset transparency on the same canvas (a canvas here means a component or area used for drawing graphics) or in another map drawing area. For example, FIG. 2 is a schematic diagram of the initial merged map image of the Northeast region in an embodiment of this specification. As shown in FIG. 2, based on the latitude and longitude information of the three provinces, the map images of Liaoning, Jilin, and Heilongjiang can be drawn on the same canvas according to their relative positions; together, the three provincial map images form the initial merged map image of the Northeast region. As can be seen, the boundaries of adjacent regions to be merged may overlap in the initial merged map image. As shown in FIG. 2, when drawing the map image of a region to be merged, only its contour — the boundary line — needs to be drawn; the area inside the boundary line represents that region.
When drawing the map images of the regions to be merged, one embodiment of this specification uses lines of a preset transparency. The preset transparency can be chosen as needed and is normally set between 0 and 1; in one embodiment it is 0.5, a setting that simplifies the subsequent pixel detection.
In addition, when drawing the map regions to be merged, different regions may be drawn with lines of the same color or with lines of different colors. For example, when the three northeastern provinces are the regions to be merged, the map images of Liaoning, Jilin, and Heilongjiang can all be drawn on the same canvas with black lines (or another color such as red or blue) at transparency 0.5, and the three provincial map images together taken as the initial merged image of the Northeast region. Alternatively, the map image of Liaoning can be drawn with black lines at transparency 0.5, that of Jilin with red lines at transparency 0.5, and that of Heilongjiang with blue lines at transparency 0.5, the three together again forming the initial merged image of the Northeast region. That is, in one embodiment of this specification, the map images of the different regions to be merged use lines of the same transparency, while the line color is not specifically limited.
The map images of the regions to be merged can be drawn by importing GeoJSON data and then rendering the map onto the canvas programmatically. GeoJSON is a format for encoding various geographic data structures — an organization format for map data — and a map can be drawn by parsing it.
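The GeoJSON-to-canvas step can be illustrated with a small sketch. This is not the embodiment's own code: the two toy polygons, the `lon_lat_to_canvas` name, and the plain equirectangular scaling are all illustrative assumptions standing in for real province data and a real projection.

```python
import json

# Hypothetical minimal GeoJSON for two adjacent rectangular "regions";
# the coordinates are invented for illustration only.
geojson = json.loads("""
{"type": "FeatureCollection", "features": [
  {"type": "Feature", "properties": {"name": "A"},
   "geometry": {"type": "Polygon", "coordinates":
     [[[120.0, 40.0], [125.0, 40.0], [125.0, 43.0], [120.0, 43.0], [120.0, 40.0]]]}},
  {"type": "Feature", "properties": {"name": "B"},
   "geometry": {"type": "Polygon", "coordinates":
     [[[125.0, 40.0], [130.0, 40.0], [130.0, 43.0], [125.0, 43.0], [125.0, 40.0]]]}}
]}
""")

def lon_lat_to_canvas(lon, lat, bbox, width, height):
    """Map longitude/latitude to integer canvas coordinates.

    Plain equirectangular scaling; y is flipped because canvas
    coordinates grow downward while latitude grows upward.
    """
    min_lon, min_lat, max_lon, max_lat = bbox
    x = (lon - min_lon) / (max_lon - min_lon) * (width - 1)
    y = (max_lat - lat) / (max_lat - min_lat) * (height - 1)
    return round(x), round(y)

# Outer ring of each polygon, then the bounding box over all points.
rings = [feat["geometry"]["coordinates"][0] for feat in geojson["features"]]
all_pts = [pt for ring in rings for pt in ring]
bbox = (min(p[0] for p in all_pts), min(p[1] for p in all_pts),
        max(p[0] for p in all_pts), max(p[1] for p in all_pts))
outlines = [[lon_lat_to_canvas(lon, lat, bbox, 100, 60) for lon, lat in ring]
            for ring in rings]
print(outlines[0][0])  # (0, 59): lon 120 maps to x = 0, lat 40 to the bottom row
```

Each `outlines` entry is a closed list of canvas points that a stroke routine could then draw with the preset transparency.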
S4: Obtain the transparency of the pixels in the initial merged map image.
After the initial merged map image is generated, the transparency of each of its pixels can be obtained. In one embodiment of this specification, when drawing the map images of the regions to be merged, the latitude and longitude information of those regions can be converted into coordinate information. Based on that coordinate information, each pixel within the regions to be merged in the initial merged map image can be traversed — that is, each canvas pixel corresponding to a raw data point (a data point carrying latitude/longitude or coordinate information) within those regions — and the transparency corresponding to those pixels obtained. The transparency of each pixel can be read using a canvas dyeing (pixel-sampling) technique; the specific method is not limited in the embodiments of the present invention.
Traversing only the pixels inside the regions to be merged yields each pixel's transparency change with a simple procedure, reduces the number of pixels outside those regions that must be inspected, and speeds up data processing.
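Steps S2 and S4 — stroking every boundary with one preset transparency and then reading back per-pixel transparency — can be sketched without a browser canvas. In an actual client, the 2D canvas's `globalAlpha` and `getImageData` would do this work; below, a plain dictionary of (x, y) → alpha stands in for the canvas, and the two rectangles are invented stand-ins for two adjacent regions sharing a border.

```python
PRESET_ALPHA = 0.5  # the preset transparency used for every boundary line

def composite_alpha(dst_alpha, src_alpha):
    # Standard source-over alpha compositing: a pixel stroked twice ends
    # up with a higher alpha than a pixel stroked once.
    return src_alpha + dst_alpha * (1.0 - src_alpha)

def bresenham(p0, p1):
    """Integer pixels on the segment p0-p1 (classic Bresenham walk)."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx, sy = (1 if x0 < x1 else -1), (1 if y0 < y1 else -1)
    err = dx + dy
    while True:
        yield x0, y0
        if (x0, y0) == (x1, y1):
            return
        e2 = 2 * err
        if e2 >= dy:
            err += dy; x0 += sx
        if e2 <= dx:
            err += dx; y0 += sy

def stroke_outline(canvas, ring):
    """Stroke one closed outline onto the alpha canvas (an (x, y) -> alpha dict)."""
    seen = set()  # composite each pixel once per outline, as one stroke would
    for p0, p1 in zip(ring, ring[1:]):
        for px in bresenham(p0, p1):
            if px not in seen:
                seen.add(px)
                canvas[px] = composite_alpha(canvas.get(px, 0.0), PRESET_ALPHA)

canvas = {}
# Two rectangles sharing the edge x = 5 (stand-ins for two adjacent regions).
stroke_outline(canvas, [(0, 0), (5, 0), (5, 4), (0, 4), (0, 0)])
stroke_outline(canvas, [(5, 0), (9, 0), (9, 4), (5, 4), (5, 0)])
print(canvas[(0, 0)])  # 0.5  — non-shared boundary pixel, stroked once
print(canvas[(5, 2)])  # 0.75 — shared boundary pixel, stroked by both outlines
```

The read-back in S4 is then just a lookup in `canvas`: any pixel above the preset 0.5 lies on an overlapping boundary.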
S6: Take the pixels whose transparency equals the preset transparency as target pixels, and generate the merged target map image from the target pixels.
After the transparency of the pixels in the initial merged map image is obtained, it can be compared with the preset transparency of the lines used when drawing the map images of the regions to be merged, and the pixels whose transparency equals the preset transparency are taken as target pixels. For example, if the preset transparency used when drawing was 0.5, the pixels with transparency 0.5 in the initial merged map image can be taken as target pixels. The merged target map image can then be generated from the target pixels, completing the merge of the map regions.
In one embodiment of this specification, the coordinate information of the target pixels can be extracted and exported, and the target merged map image generated from the set of coordinate information corresponding to the target pixels. For example, if the preset transparency used when drawing the map images of the regions to be merged was 0.5, the pixels with transparency 0.5 in the initial merged map image can be taken as target pixels, their coordinates extracted and saved into a coordinate point set, and the set composed of all target pixel coordinates exported. The target merged map image is then drawn from this coordinate set; it consists of the boundary image of the regions to be merged, and this generated boundary image no longer contains the overlapping boundary segments of those regions.
For example, in one embodiment of this specification, the map images of the regions to be merged can be drawn on a canvas or map drawing area with black lines at transparency 0.5; these map images include the boundary images of the regions and together form the initial merged map image (see the schematic of the initial merged map image of the Northeast in FIG. 2). Traversing the pixels within the regions to be merged in the initial merged map image: pixels on non-overlapping boundary segments have transparency 0.5, pixels where boundaries overlap usually have transparency greater than 0.5, and pixels in the remaining areas inside the boundary images, where nothing was drawn, have transparency 0. The pixels with transparency 0.5 — the pixels on the non-overlapping segments of the boundary images — can be taken as target pixels. Combined, the target pixels represent the boundary image of the merged regions; this merged boundary image no longer contains the overlapping segments of the individual regions and represents the overall boundary outline of all the regions. The coordinates of the target pixels can be extracted, saved, and exported to generate the merged target map image. FIG. 3 is a schematic diagram of the merged target map image of the Northeast region in one embodiment of this specification; as shown in FIG. 3, the merged target map image removes the overlapping boundary segments of the regions and keeps only the non-overlapping ones, showing the merge result intuitively and making it easy for users to view.
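Step S6's selection of target pixels can be sketched as a simple filter over the alpha values; the hand-built alpha map below is an invented stand-in for the traversal result. One practical caveat: on a real canvas, alpha is an 8-bit channel read back via `getImageData`, so a preset of 0.5 appears as 127 or 128 out of 255, and the comparison needs a tolerance rather than strict equality.

```python
PRESET_ALPHA = 0.5
EPS = 1e-6  # tolerance; widen for 8-bit alpha (e.g. 1/255)

# Hand-built alpha map standing in for the traversal result: boundary
# pixels carry 0.5, one shared-edge pixel carries 0.75, and an interior
# pixel carries 0.0 because nothing was drawn there.
alpha_map = {
    (0, 0): 0.5, (1, 0): 0.5, (2, 0): 0.5,
    (2, 1): 0.75,   # overlapping boundary pixel -> not a target
    (1, 1): 0.0,    # interior pixel, nothing drawn -> not a target
}

# Keep exactly the pixels whose alpha matches the preset transparency;
# sorting gives a stable, exportable coordinate point set.
target_pixels = sorted(
    pt for pt, a in alpha_map.items() if abs(a - PRESET_ALPHA) < EPS
)
print(target_pixels)  # [(0, 0), (1, 0), (2, 0)] — overlap and interior dropped
```

The exported `target_pixels` set is what the embodiment redraws to obtain the merged boundary without the overlapping segments.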
FIGS. 4(a)-4(b) are schematic diagrams of transparency change detection in an embodiment of this specification. As shown in FIG. 4(a), two images with transparency 0.5 are partially superimposed, and the transparency of the overlapping middle part is visibly greater than that of the non-overlapping parts. Likewise, as shown in FIG. 4(b), when two borderless images with transparency 0.5 are partially superimposed, the transparency of the overlapping middle part is again greater than that of the non-overlapping parts. In the embodiments of this specification, detecting changes in pixel transparency makes it possible to determine quickly and accurately which parts of the regions to be merged overlap and which do not, so that the merged map image can be generated quickly and accurately.
In addition, in the embodiments of the present invention, the merged target map image may be named according to the geographic location of the regions that were merged; for example, in FIG. 3 the merged three northeastern provinces are named the Northeast region.
With the data processing method for merging map regions provided in this specification, a canvas dyeing (pixel-sampling) technique can detect whether the transparency of each pixel has changed. Because, during the merge, the transparency of pixels in overlapping parts differs from that of pixels in non-overlapping parts, the pixels on overlapping boundaries can be separated from the pixels on non-overlapping boundaries, and the merged map image is then generated from the non-overlapping boundary pixels. The method is simple and fast, needs no complex data processing, and detects overlapping boundary segments accurately, making the merge of map regions faster and more accurate; the merged map region can be interacted with as a whole, which is convenient for subsequent use and widely applicable.
On the basis of the foregoing embodiments, in one embodiment of this specification, drawing the map images of the regions to be merged using lines of the preset transparency to generate the initial merged map image may include:

obtaining, in a first map drawing area, the regions to be merged as selected by the user; and

drawing, according to the regions to be merged selected by the user, the map images of those regions in a second map drawing area using lines of the preset transparency, to generate the initial merged map image.
Specifically, when the user is viewing a map in the first map drawing area (such as a canvas on a client) and some map regions need to be merged — say, the three northeastern provinces — the user can select the regions to be merged by clicking or by other operations. For example, the user draws a complete map (such as a map of China) in the first map drawing area by importing GeoJSON data, then clicks Liaoning, Jilin, and Heilongjiang on the drawn map of China to select the regions to be merged. Once the user's selection is recognized, the map images of those regions can be drawn with lines of the preset transparency in a second map drawing area (which may use a hidden canvas) to generate the initial merged map image; for details, refer to the foregoing embodiments, which are not repeated here. Generating the initial merged map image from the user's selection lets users choose the regions to merge according to their actual needs; the approach is simple and flexible and improves the user experience.
On the basis of the foregoing embodiments, in one embodiment of this specification, generating the merged target map image from the target pixels may include:

removing the pixels other than the target pixels from the initial merged map image; and

taking the map image formed by the target pixels in the initial merged map image as the target merged map image.
Specifically, once the target pixels are determined, the target merged map image can be generated by removing the non-target pixels (all pixels other than the target pixels) from the initial merged map image. Only the target pixels then remain in the initial merged map image, and the remaining target pixels together form the merged target map image. For example, in the foregoing embodiment, the non-target pixels are removed in the second map drawing area and the target pixels retained; the image formed by the remaining target pixels in the second map drawing area then represents the merged target map image.
Removing the non-target pixels, whose transparency does not meet the requirement, from the initial merged map image and generating the merged target map image directly from the remaining target pixels is simple and merges the map regions accurately.
After the non-target pixels are removed, the color at their positions can be set to match the color of the pixels inside the boundaries in the initial merged map image (that is, the interior of the regions to be merged). This prevents the removed positions from differing in color from the rest of the interior and spoiling the display of the merged map image. For example, if the initial merged map image has a background color — say, the interior of the regions to be merged is filled with red pixels — then after the non-target pixels are removed, their positions may turn white or transparent, differing from the surrounding interior pixels and degrading the display of the merged image. The color at those positions can then be set to red, consistent with the interior of the regions, improving the display of the merged map image. If the interiors of the different regions are filled with different colors — that is, the non-boundary parts of the initial merged map image contain multiple colors — then the color of any pixel adjacent to a removed non-target pixel can be used as the color at that position.
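The neighbor-based recoloring described above can be sketched as follows; the color table, the removed pixel, and the 4-neighborhood choice are invented for illustration.

```python
# Region fill colors keyed by pixel; removed non-target (overlapping
# boundary) pixels are recolored from any already-colored neighbor so
# no white/transparent gap is left behind.
RED = (255, 0, 0)
BLUE = (0, 0, 255)

colors = {
    (0, 0): RED, (1, 0): RED,    # interior of region A
    (3, 0): BLUE, (4, 0): BLUE,  # interior of region B
}
removed = [(2, 0)]               # a culled non-target (overlap) pixel

def neighbor_color(pt, colors):
    """Color of any adjacent pixel that already has one (4-neighborhood)."""
    x, y = pt
    for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
        if (nx, ny) in colors:
            return colors[(nx, ny)]
    return None

for pt in removed:
    c = neighbor_color(pt, colors)
    if c is not None:
        colors[pt] = c

print(colors[(2, 0)])  # (255, 0, 0): picked up the neighboring fill color
```

With a single uniform background color, the lookup is unnecessary and the removed positions can simply be filled with that color directly.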
With the data processing method for merging map regions provided in this specification, a uniform transparency is set so that the transparency of the overlapping parts changes when map regions are merged, and the designated regions are merged based on these transparency changes in the map image. Merging designated map regions by detecting pixel transparency changes is simple and fast, requires no complex mathematical calculation, and the merged region can be interacted with as a whole, giving the method broad applicability.
The embodiments of the above method in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. For related details, refer to the description of the method embodiments.
Based on the data processing method for merging map regions described above, one or more embodiments of this specification also provide a data processing device for merging map regions. The device may include systems (including distributed systems), software (applications), modules, components, servers, clients, and the like that use the methods described in the embodiments of this specification, combined with the necessary implementation hardware. Based on the same innovative concept, the devices in one or more embodiments provided by this specification are as described in the following embodiments. Since the device solves the problem in a manner similar to the method, the implementation of the specific devices in the embodiments of this specification may refer to the implementation of the foregoing method; repeated parts are not described again. As used below, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and conceived.
Specifically, FIG. 5 is a schematic diagram of the module structure of an embodiment of the data processing device for merging map regions provided in this specification. As shown in FIG. 5, the device includes an initial merged image drawing module 51, a transparency obtaining module 52, and a target merged map generating module 53, where:
the initial merged image drawing module 51 may be configured to draw map images of the regions to be merged using lines of a preset transparency, generating an initial merged map image;

the transparency obtaining module 52 may be configured to obtain the transparency of the pixels in the initial merged map image; and

the target merged map generating module 53 may be configured to take the pixels whose transparency equals the preset transparency as target pixels, and generate the merged target map image from the target pixels.
With the data processing device for merging map regions provided by the embodiments of this specification, a canvas dyeing (pixel-sampling) technique can detect whether the transparency of each pixel has changed. Because, during map merging, the transparency of pixels in overlapping parts differs from that of pixels in non-overlapping parts, the pixels on overlapping boundaries can be separated from the pixels on non-overlapping boundaries, and the merged map image is then generated from the non-overlapping boundary pixels. The method is simple and fast, needs no complex data processing, and detects overlapping boundary segments accurately, making the merge of map regions faster and more accurate; the merged map region can be interacted with as a whole, which is convenient for subsequent use and widely applicable.
On the basis of the foregoing embodiments, the initial merged image drawing module is specifically configured to:

obtain, in a first map drawing area, the regions to be merged as selected by the user; and

draw, according to the regions to be merged selected by the user, the map images of those regions in a second map drawing area using lines of the preset transparency, to generate the initial merged map image.
On the basis of the foregoing embodiments, the target merged map generating module is specifically configured to:

remove the pixels other than the target pixels from the initial merged map image; and

take the map image formed by the target pixels in the initial merged map image as the target merged map image.
In this embodiment of the specification, the non-target pixels whose transparency does not meet the requirement are removed from the initial merged map image, and the merged target map image is generated directly from the remaining target pixels; the approach is simple and merges the map regions accurately.
On the basis of the foregoing embodiments, the target merged map generating module is further configured to:

set the color at the positions of the pixels other than the target pixels to match the color of the pixels inside the regions to be merged in the initial merged map image.
In this embodiment of the specification, after the non-target pixels are removed, the color at their positions is set (to red, in the example above) to stay consistent with the color inside the boundaries of the regions to be merged, improving the display of the merged map image.
On the basis of the foregoing embodiments, the target merged map generating module is specifically configured to:

extract the coordinate information corresponding to the target pixels, and generate the target merged map image from the set of that coordinate information.
In this embodiment of the specification, the target merged map image is generated from the set of coordinates of the target pixels; the approach is fast, needs no complex data processing, and detects overlapping boundary segments accurately, making the merge of map regions faster and more accurate; the merged map region can be interacted with as a whole, which is convenient for subsequent use and widely applicable.
On the basis of the foregoing embodiments, the transparency obtaining module is specifically configured to:

traverse, according to the coordinate information of the regions to be merged, the pixels within those regions in the initial merged map image, and obtain the transparency corresponding to the pixels within those regions.
In this embodiment of the specification, traversing the pixels inside the regions to be merged yields each pixel's transparency change with a simple procedure, reduces the inspection of pixels outside those regions, and speeds up data processing.
It should be noted that the device described above may also have other implementations in accordance with the description of the method embodiments. For specific implementations, refer to the description of the related method embodiments; they are not repeated here one by one.
In one embodiment of this specification, a computer storage medium may further be provided, on which a computer program is stored; when executed, the computer program implements the data processing method for merging map regions of the foregoing embodiments, for example:
drawing map images of the regions to be merged using lines of a preset transparency to generate an initial merged map image;

obtaining the transparency of the pixels in the initial merged map image; and

taking the pixels whose transparency equals the preset transparency as target pixels, and generating the merged target map image from the target pixels.
Specific embodiments of this specification have been described above. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some implementations, multitasking and parallel processing are also possible or may be advantageous.
The methods or devices of the foregoing embodiments provided in this specification may implement their business logic through a computer program recorded on a storage medium, which can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
The data processing method or device for merging map regions provided in the embodiments of this specification may be implemented in a computer by a processor executing the corresponding program instructions — for example, implemented on a PC in the C++ language under the Windows operating system, implemented under a Linux system, implemented on a smart terminal in the Android or iOS programming languages, or implemented with the processing logic of a quantum computer. In an embodiment of the data processing system for merging map regions provided by this specification, FIG. 6 is a schematic diagram of the module structure of such a system. As shown in FIG. 6, the system may include a processor 61 and a memory 62 for storing instructions executable by the processor,
where the processor 61 and the memory 62 communicate with each other through a bus 63; and
the processor 61 is configured to call the program instructions in the memory 62 to execute the methods provided by the foregoing embodiments of the data processing method for merging map regions, for example: drawing map images of the regions to be merged using lines of a preset transparency to generate an initial merged map image; obtaining the transparency of the pixels in the initial merged map image; and taking the pixels whose transparency equals the preset transparency as target pixels, and generating the merged target map image from the target pixels.
It should be noted that the devices, computer storage media, and systems described above in this specification may also have other implementations in accordance with the description of the related method embodiments. For specific implementations, refer to the description of the method embodiments; they are not repeated here one by one.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the hardware-plus-program embodiments are basically similar to the method embodiments, so their description is relatively brief; for related details, refer to the description of the method embodiments.
The embodiments of this specification are not limited to what must comply with industry communication standards, standard computer data processing and data storage rules, or the situations described in one or more embodiments of this specification. Implementations slightly modified on the basis of certain industry standards, custom approaches, or the embodiments described can also achieve effects that are the same as, equivalent or similar to, or foreseeable variants of those of the foregoing embodiments. Embodiments applying such modified or varied data acquisition, storage, judgment, and processing may still fall within the scope of the optional implementations of the embodiments of this specification.
In the 1990s, an improvement to a technology could be clearly distinguished as a hardware improvement (for example, an improvement to circuit structures such as diodes, transistors, and switches) or a software improvement (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be realized with a hardware entity module. For example, a programmable logic device (PLD), such as a field programmable gate array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. Designers program by themselves to "integrate" a digital system onto a single PLD, without asking a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of manually fabricating integrated circuit chips, this programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, while the source code to be compiled must be written in a particular programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. Those skilled in the art should also understand that a hardware circuit implementing a logical method flow can easily be obtained merely by slightly logically programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
A controller may be implemented in any suitable manner. For example, a controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, besides implementing a controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the devices included within it for implementing various functions can also be regarded as structures within the hardware component. Or even, the devices for implementing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules, or units set forth in the foregoing embodiments may specifically be implemented by a computer chip or entity, or by a product having certain functions. A typical implementation device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, an in-vehicle human-machine interaction device, a cellular phone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Although one or more embodiments of this specification provide the method operation steps described in the embodiments or flowcharts, more or fewer steps may be included based on conventional or non-inventive means. The order of steps enumerated in the embodiments is only one of many possible execution orders and does not represent the only one. When an actual device or terminal product executes, the steps may be executed sequentially or in parallel according to the methods shown in the embodiments or drawings (for example, in an environment of parallel processors or multithreaded processing, or even in a distributed data processing environment). The terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, product, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, product, or device. Without further limitation, the existence of additional identical or equivalent elements in the process, method, product, or device that includes the stated elements is not excluded. Words such as "first" and "second" are used to denote names and do not denote any particular order.
For convenience of description, the above devices are described with their functions divided into various modules. Of course, when implementing one or more embodiments of this specification, the functions of the modules may be implemented in one or more pieces of software and/or hardware, modules implementing the same function may be implemented by a combination of multiple sub-modules or sub-units, and so on. The device embodiments described above are merely illustrative; for example, the division into units is only a division by logical function, and there may be other divisions in actual implementation — multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The present invention is described with reference to flowcharts and/or block diagrams of the methods, devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-permanent memory, random access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage, graphene storage, or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Those skilled in the art should understand that one or more embodiments of this specification may be provided as a method, a system, or a computer program product. Accordingly, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types. One or more embodiments of this specification may also be practiced in distributed computing environments, where tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in local and remote computer storage media, including storage devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiment is basically similar to the method embodiments, so its description is relatively brief; for related details, refer to the description of the method embodiments. In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of this specification. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the described specific features, structures, materials, or characteristics may be combined in a suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art may combine different embodiments or examples described in this specification, and features of different embodiments or examples.
The above descriptions are merely examples of one or more embodiments of this specification and are not intended to limit the one or more embodiments of this specification. For those skilled in the art, various modifications and variations of the one or more embodiments of this specification are possible. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this specification shall fall within the scope of the claims.
With the development of computer network technology, people can understand the geographic location and other information around the world through electronic maps. Under normal circumstances, maps or electronic maps are divided into provinces and countries. However, in some cases, some designated areas may need to be merged on the map, such as combining the three northeastern provinces on the map of China into the Northeast. , In order to facilitate users to have an overall understanding of the Northeast region.
The data processing method for merging map areas provided by the embodiments of this specification, by setting a uniform transparency, when merging the map areas, the overlapped part transparency will change, and based on the change of transparency in the map image, the designated areas are merged. Based on the change of pixel transparency, the method of merging the specified map area is simple and fast, and does not require complicated mathematical calculations. The merged map area can be interacted as a whole and has strong applicability.
In the embodiment of the present invention, the data processing process of the merged map area can be performed on the user terminal, such as: smart phones, tablet computers, smart wearable devices (smart watches, virtual reality glasses, virtual reality helmets, etc.), etc. equipment. Specifically, it can be performed on the browser side of the user side, such as: PC browser side, mobile browser side, server side web container, etc.
Specifically, FIG. 1 is a schematic flowchart of a data processing method for map area merging in an embodiment provided in this specification. As shown in FIG. 1, the data processing method for map area merging provided in this embodiment of this specification includes:
S2. Draw a map image of the area to be merged using a line with preset transparency to generate an initial merged map image.
The area to be merged can include multiple map areas. For example, the three provinces of Liaoning, Jilin and Heilongjiang in the three northeastern provinces can represent three areas to be merged, and they can be on the same canvas (canvas can represent components used to draw graphics Or area) or other map drawing area draws a map image of the area to be merged using lines with preset transparency. For example: Figure 2 is a schematic diagram of the initial merged map image of the Northeast region in an embodiment of this specification. As shown in Figure 2, according to the latitude and longitude information of the three northeastern provinces, in the same canvas, the Map images of Liaoning Province, Jilin Province, and Heilongjiang Province are drawn from the relative positions of the three provinces. The map images of the three provinces together form the initial merged map image of the northeast region. Boundaries may overlap between merged areas. As shown in FIG. 2, when drawing the map image of the area to be merged, only the contour line of each area to be merged, that is, the boundary line, may be drawn, and the area within the boundary line may represent the area to be merged.
Wherein, when drawing the map image of the area to be merged, in one embodiment of this specification, lines with preset transparency are used for drawing. The preset transparency can be selected according to actual needs. Normally, the preset transparency can be set between 0-1 Meanwhile, in an embodiment of this specification, the preset transparency may be 0.5, and this setting can facilitate subsequent pixel detection.
In addition, when drawing each map area to be merged, different map areas to be merged may be drawn using lines of the same color, or lines of different colors may be used for drawing. For example, when the map area of the three northeastern provinces is used as the area to be merged, you can use black (or other colors such as red, blue, etc.) and lines with a transparency of 0.5 to draw the lines of Liaoning, Jilin, and Heilongjiang on the same canvas. For the map image, the entire map image of the three provinces is taken as the initial merged image of the northeast region. You can also use black lines with a transparency of 0.5 to draw a map image of Liaoning Province, use red lines with a transparency of 0.5 to draw a map image of Jilin Province, and use a blue line with a transparency of 0.5 to draw a map image of Heilongjiang Province , The entire map image of the three provinces is taken as the initial merged image of the northeast region. That is, when drawing the map image of each area in the area to be merged in an embodiment of this specification, the map images of different areas may use lines with the same transparency, but the color of the lines may not be specifically limited.
When drawing the map image of the area to be merged, you can import GeoJSON data, and then use the program to draw the map on the canvas. GeoJSON is a format for encoding various geographic data structures and an organization format for map data. Maps can be drawn by parsing this data.
S4. Obtain the transparency of pixels in the initial merged map image.
After the initial merged map image is generated, the transparency of each pixel in the initial merged map image can be obtained. In one embodiment of this specification, when drawing a map image of the area to be merged, the latitude and longitude information of the area to be merged can be converted into coordinate information. According to the coordinate information of the area to be merged, it is possible to traverse each pixel in the area to be merged in the initial merged map image, that is, to traverse each original data (data point containing the latitude and longitude information) Or the data points of the coordinate information) corresponding to the canvas pixels, and the transparency corresponding to the pixels in the area to be merged in the initial merged map image is obtained. The transparency of each pixel can be obtained based on the dyeing technology, and the specific method is not specifically limited in the embodiment of the present invention.
By traversing the pixels inside the area to be merged, the transparency change of each pixel is obtained. The method is simple, and the detection of pixels outside the area to be merged can be reduced, and the data processing speed is improved.
S6. Use a pixel with the same transparency of the pixel as the preset transparency as a target pixel, and generate a merged target merged map image according to the target pixel.
After obtaining the transparency of the pixels in the initial merged map image, the transparency of the pixels can be compared with the preset transparency of the lines used when drawing the map image of the area to be merged, and the transparency is the same as the preset transparency. The pixel is regarded as the target pixel. For example, if the preset transparency used when drawing the map image of the area to be merged is 0.5, then the pixel with the transparency 0.5 in the initial merged map image can be used as the target pixel. The target pixels can be used to generate a merged target merged map image to complete the merge of the map area.
In one embodiment of this specification, the coordinate information of the target pixel can be extracted, the coordinate information corresponding to the target pixel can be derived, and the target merged map image can be generated by using the collection of the coordinate information corresponding to the target pixel. For example, if the default transparency used when drawing the map image of the area to be merged is 0.5, the pixel with 0.5 transparency in the initial merged map image can be used as the target pixel, and the coordinate information of the target pixel can be extracted. The coordinate information of the target pixel is stored in the coordinate point set, and the coordinate point set composed of the coordinate information of all target pixels can be exported. The target composite map image is drawn according to the coordinate set of the target pixel. The target composite map image can be composed of the boundary image of the area to be merged. At this time, the generated boundary image of the area to be merged may not include the area to be merged. The part where the boundary overlaps.
For example, in an embodiment of this specification, a black line with a transparency of 0.5 may be used to draw a map image of the area to be merged on the canvas or map drawing area, and the map image of the area to be merged may include the boundary image of the area to be merged. The map images of the area to be merged can form the initial merged map image. For details, please refer to the schematic diagram of the initial merged map image of the northeast map in FIG. 2. Traverse the pixels in the area to be merged in the initial merged map image. The transparency of the non-overlapping pixels at the boundary of the area to be merged is 0.5. The transparency of the overlapping pixels is usually greater than 0.5. Others inside the boundary image Because the image content is not drawn in the area, the transparency of the pixels is 0. A pixel with a transparency of 0.5 can be used as the target pixel, that is, the pixel in the non-overlapping part of the boundary image can be used as the target pixel. The target pixels are combined together to represent the boundary image of the merged area. At this time, the merged boundary image does not include the overlap of each area to be merged, which can represent the overall boundary contour of each area to be merged . The coordinate information of the target pixel can be extracted and saved, and the coordinate information corresponding to the target pixel can be exported to generate a merged target merged map image. FIG. 3 is a schematic diagram of a target merged map image of the northeast region after merge in an embodiment of this specification. As shown in FIG. 3, the merged target map image in an embodiment of this specification can overlap the boundaries of the regions to be merged Partial removal, and only the non-overlapping part of the boundary is retained, which intuitively shows the effect of map area merging, which is convenient for users to view.
Figure 4(a)-4(b) is a schematic diagram of transparency change detection in an embodiment of this specification. As shown in Figure 4(a), two images with a transparency of 0.5 are partially superimposed. It can be seen that the transparency of the image in the middle superimposed part is greater than the transparency of the other non-superimposed parts. Similarly, as shown in Figure 4(b), the two images without borders with a transparency of 0.5 are partially superimposed. It can be seen that the transparency of the image in the middle overlapped part is greater than that of the other non-overlapped images. transparency. In the embodiment of the present specification, by detecting the change in the transparency of the pixels, it is possible to accurately and quickly detect which parts of the region to be merged overlap and which parts do not overlap, so as to realize the rapid and accurate generation of the merged map image.
In addition, the embodiment of the present invention may also name the merged target merged map image according to the geographic location of the area to be merged. For example, in FIG. 3, the merged northeastern provinces are named as the northeast area.
The data processing method of map area merging provided in this manual can detect whether the transparency of pixels has changed through the on-canvas dyeing technology. Based on the difference between the transparency of pixels in the overlapping part and the pixels in the non-overlapping part during map merging, filter The pixel points that overlap the boundary and the pixels that do not overlap the boundary are further generated to generate a merged map image based on the pixels that do not overlap the boundary. The method is simple and fast, does not require complex data processing, and can accurately detect the boundary overlap, making the merging of the map area more accurate and fast. The merged map area can be interacted as a whole, which is convenient for subsequent use and has wide applicability.
On the basis of the foregoing embodiment, in an embodiment of the present specification, the use of lines with preset transparency to draw a map image of the area to be merged to generate an initial merged map image may include:
Acquiring the area to be merged selected by the user in the first map drawing area;
According to the area to be merged selected by the user, a map image of the area to be merged is drawn in the second map drawing area using lines of the preset transparency to generate the initial merged map image.
Specifically, when the user views the map in the first map drawing area (such as the canvas of a certain user terminal), if part of the map area needs to be merged, if the map area of the three northeastern provinces needs to be merged, the user can click or other Operate to select the area to be merged to be merged. For example, the user draws a complete map (such as drawing a map of China) in the first map drawing area by importing GeoJSON data, and clicks on the three northeastern provinces of Liaoning, Jilin, and Heilongjiang in the drawn China map, and selects Area to be merged. After recognizing the area to be merged selected by the user, you can draw the map image of the area to be merged in the second map drawing area (the second map drawing area can use the hidden canvas) with preset transparency lines to generate the initial merged map image For details, reference may be made to the foregoing embodiment to generate an initial merged map image, which will not be repeated here. The initial merged map image is generated based on the user's selection. The user can select the area to be merged according to actual needs to merge the map area. The method is simple and flexible, and the user experience is improved.
On the basis of the foregoing embodiment, in an embodiment of this specification, the generating a merged target merged map image according to the target pixels may include:
Removing pixels other than the target pixel in the initial merged map image from the initial merged map image;
Use a map image composed of the target pixels in the initial merged map image as the target merged map image.
Specifically, when the target pixels are determined and the target merged map image is generated based on the target pixels, the non-target pixels in the initial merged map image (that is, pixels other than the target pixels) can be merged from the initial Removed from the map image. At this time, only target pixels are left in the initial merged map image, and the remaining target pixels can be merged to form a merged target merged map image. For example, in the above-mentioned embodiment, non-target pixels are eliminated from the second map drawing area and the target pixels are retained. Then the image composed of the remaining target pixels in the second map drawing area can represent the merged target merged map. image.
The non-target pixels whose transparency does not meet the requirements are removed from the initial merged map image, and the remaining target pixels directly generate the merged target merged map image. The method is simple and the map area is merged accurately.
After removing the non-target pixels, the color of the non-target pixel positions can be set to be the same as the color of the pixels in the boundary inner area (that is, the inner area of the area to be merged) in the initial merged map image. This can prevent the color of the non-target pixels from being different from the colors of other areas inside the boundary of the area to be merged after the non-target pixels are removed, which affects the display effect of the merged map image. For example, if the initial merged map image is generated when the map image of the area to be merged is drawn, the initial merged map image has a background color. For example: the inner area of the boundary of the area to be merged is filled with red pixels, then the non-target pixels are removed Later, the color at the location of the non-target pixel point may become white or colorless, which is different from the color of other pixels in the inner area of the boundary of the area to be merged, which affects the display effect of the merged map image. After removing the non-target pixels, the color of the non-target pixels can be set to red, which is consistent with the color of the inner area of the boundary of the area to be merged, so as to improve the display effect of the merged map image. If the boundary of each area in the area to be merged is filled with different colors, that is, there are multiple colors at the non-boundary in the initial merged map image, the color of any pixel adjacent to the non-target pixel can be used as the non-target pixel The color at the position after culling.
The data processing method of map area merging provided in this manual, by setting uniform transparency, when merging map areas, the overlapped part transparency will change. Based on the change of transparency in the map image, the specified area will be merged. Based on the change of pixel transparency, the method of merging the specified map area is simple and fast, and does not require complicated mathematical calculations. The merged map area can be interacted as a whole and has strong applicability.
The various embodiments of the above method in this specification are described in a progressive manner, and the same or similar parts between the various embodiments can be referred to each other, and each embodiment focuses on the differences from other embodiments. For related details, please refer to the partial description of the method embodiment.
Based on the aforementioned data processing method for merging map areas, one or more embodiments of this specification also provide a data processing device for merging map areas. The described devices may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc. that use the methods described in the embodiments of this specification, combined with necessary implementation hardware devices. Based on the same innovative concept, the devices in one or more embodiments provided in the embodiments of this specification are as described in the following embodiments. Since the implementation scheme of the device to solve the problem is similar to the method, the implementation of the specific device in the embodiment of this specification can refer to the implementation of the foregoing method, and the repetition will not be repeated. As used below, the term "unit" or "module" can be a combination of software and/or hardware that can implement predetermined functions. Although the devices described in the following embodiments are preferably implemented by software, the implementation of hardware or a combination of software and hardware is also possible and conceived.
Specifically, FIG. 5 is a schematic diagram of the module structure of an embodiment of the data processing device for merging map areas provided in this specification. As shown in FIG. 5, the device includes an initial merged image drawing module 51, a transparency acquisition module 52, and a target merged map generation module 53, wherein:
The initial merged image drawing module 51 may be used to draw a map image of the area to be merged using lines with preset transparency to generate an initial merged map image;
The transparency acquisition module 52 may be used to acquire the transparency of the pixels in the initial merged map image;
The target merged map generation module 53 may be used to take pixels whose transparency is the same as the preset transparency as target pixels, and to generate a merged target merged map image according to the target pixels.
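The cooperation of the three modules can be sketched with a simplified simulation; rectangle outlines stand in for real region boundaries, and every name in the sketch is an illustrative assumption rather than part of the claimed device:

```python
# Simplified end-to-end simulation of modules 51-53 on a tiny pixel grid:
# rectangle outlines stand in for province boundaries; names are illustrative.

PRESET_ALPHA = 0.5

def draw_outline(alpha, rect):
    """Module 51 (sketch): draw a rectangle outline with the preset alpha,
    accumulating per-pixel alpha the way source-over compositing would."""
    x0, y0, x1, y1 = rect
    for x in range(x0, x1 + 1):
        for y in range(y0, y1 + 1):
            if x in (x0, x1) or y in (y0, y1):  # pixel lies on the border
                a = alpha.get((x, y), 0.0)
                alpha[(x, y)] = PRESET_ALPHA + a * (1.0 - PRESET_ALPHA)

def target_pixels(alpha):
    """Modules 52-53 (sketch): keep only pixels whose alpha still equals the
    preset value; borders drawn twice drop out."""
    return {p for p, a in alpha.items() if abs(a - PRESET_ALPHA) < 1e-9}

alpha = {}
draw_outline(alpha, (0, 0, 4, 4))  # first area to be merged
draw_outline(alpha, (4, 0, 8, 4))  # second area; shares the edge x == 4
merged = target_pixels(alpha)

# The shared edge x == 4 was drawn twice (alpha 0.75) and is filtered out,
# leaving only the outer boundary of the merged region.
assert all(x != 4 for x, y in merged)
```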
The data processing device for merging map areas provided by the embodiments of this specification can detect, through on-canvas dyeing technology, whether the transparency of a pixel has changed. Because the transparency of pixels in the overlapping parts of the map boundaries differs from the transparency of pixels in the non-overlapping parts, pixels at overlapping borders can be separated from pixels where no borders overlap, and the merged map image is then generated from the pixels without overlapping borders. The method is simple and fast, requires no complex data processing, and can accurately detect boundary overlap, making the merging of map areas more accurate and rapid. The merged map area can be interacted with as a whole, which is convenient for subsequent use and widely applicable.
On the basis of the foregoing embodiment, the initial merged image drawing module is specifically used for:
acquiring the area to be merged selected by the user in a first map drawing area;
according to the area to be merged selected by the user, drawing a map image of the area to be merged in a second map drawing area using lines of the preset transparency, to generate the initial merged map image.
On the basis of the foregoing embodiment, the target merged map generation module is specifically used for:
removing the pixels other than the target pixels from the initial merged map image;
using the map image composed of the target pixels remaining in the initial merged map image as the target merged map image.
In the embodiments of this specification, the non-target pixels whose transparency does not meet the requirement are removed from the initial merged map image, and the remaining target pixels directly form the merged target merged map image. The method is simple, and the map areas are merged accurately.
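The removal step can be pictured as clearing entries of an RGBA buffer; representing the preset transparency 0.5 as the 8-bit alpha value 128 and the row-of-tuples buffer layout are assumptions of this sketch:

```python
# Sketch of removing non-target pixels from an RGBA buffer; storing the preset
# transparency 0.5 as the 8-bit value 128 is an assumption of this example.

PRESET_ALPHA_8BIT = 128

def remove_non_target(rgba_rows):
    """Clear, in place, every pixel whose alpha differs from the preset value;
    the remaining pixels form the target merged map image."""
    for row in rgba_rows:
        for i, (r, g, b, a) in enumerate(row):
            if a != PRESET_ALPHA_8BIT:
                row[i] = (0, 0, 0, 0)  # fully transparent: pixel removed
    return rgba_rows

# One row: a preset-alpha border pixel, an overlapped border pixel (alpha
# raised to roughly 0.75, i.e. 191), and an empty pixel.
buf = [[(0, 0, 0, 128), (0, 0, 0, 191), (0, 0, 0, 0)]]
remove_non_target(buf)
```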
On the basis of the foregoing embodiment, the target merged map generation module is further used for:
setting the color at the positions of the pixels other than the target pixels to be the same as the color of the pixels inside the area to be merged in the initial merged map image.
In one embodiment of this specification, after the non-target pixels are removed, the color at the positions of the non-target pixels is set to red, consistent with the color of the area inside the boundary of the area to be merged, thereby improving the display effect of the merged map image.
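This repainting step can be sketched as follows; treating removed pixels as fully transparent and using red as the interior fill color follows the example in the text, while the function shape is our assumption:

```python
# Sketch of repainting removed pixel positions with the interior fill color
# (red in the example above) so the erased inner borders blend into the region.

INTERIOR_COLOR = (255, 0, 0, 255)  # opaque red, as in the embodiment's example

def repaint_removed(rgba_rows, fill=INTERIOR_COLOR):
    """Replace every cleared (fully transparent) pixel with the fill color."""
    for row in rgba_rows:
        for i, px in enumerate(row):
            if px == (0, 0, 0, 0):
                row[i] = fill
    return rgba_rows

buf = [[(0, 0, 0, 128), (0, 0, 0, 0)]]  # kept border pixel, removed pixel
repaint_removed(buf)
```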
On the basis of the foregoing embodiment, the target merged map generation module is specifically used for:
extracting the coordinate information corresponding to the target pixels, and generating the target merged map image according to the set of coordinate information corresponding to the target pixels.
In the embodiments of this specification, the target merged map image is generated based on the set of coordinate information of the target pixels. The method is quick, requires no complex data processing, and can accurately detect boundary overlap, making the merging of map areas more accurate and faster. The merged map area can be interacted with as a whole, which is convenient for subsequent use and widely applicable.
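Generating the image from the coordinate set might look like the following sketch, in which the "image" is a simple boolean grid; the rendering form and all names are our illustrative assumptions:

```python
# Sketch: collect the coordinates of the target pixels, then redraw the merged
# map from that coordinate set alone; the boolean-grid image is illustrative.

def coords_of_targets(alpha_map, preset=0.5, eps=1e-9):
    """Extract the coordinate information of every target pixel."""
    return {p for p, a in alpha_map.items() if abs(a - preset) < eps}

def render(coords, width, height):
    """Generate the target merged map image from the coordinate set."""
    return [[(x, y) in coords for x in range(width)] for y in range(height)]

# Middle pixel has alpha 0.75 (an overlapped border) and is excluded.
coords = coords_of_targets({(0, 0): 0.5, (1, 0): 0.75, (2, 0): 0.5})
image = render(coords, 3, 1)  # [[True, False, True]]
```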
On the basis of the foregoing embodiment, the transparency acquisition module is specifically used for:
according to the coordinate information of the area to be merged, traversing the pixels within the area to be merged in the initial merged map image, to obtain the transparency corresponding to the pixels within the area to be merged in the initial merged map image.
In the embodiments of this specification, the transparency change of each pixel is obtained by traversing the pixels inside the area to be merged. The method is simple, the detection of pixels outside the area to be merged is reduced, and the data processing speed is improved.
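The restricted traversal can be sketched as follows; approximating "within the area to be merged" by the bounding box of the region's coordinate data is a simplification, and `get_alpha` is a stand-in for reading a canvas pixel's alpha channel:

```python
# Sketch of traversing only pixels inside the area to be merged; the bounding
# box of the region's coordinates approximates the region here, and get_alpha
# stands in for querying a canvas pixel's alpha channel.

def traverse_region(get_alpha, region_coords):
    """Return {pixel: alpha} for every pixel inside the region's bounding box,
    leaving pixels outside the area to be merged unexamined."""
    xs = [x for x, _ in region_coords]
    ys = [y for _, y in region_coords]
    return {(x, y): get_alpha(x, y)
            for x in range(min(xs), max(xs) + 1)
            for y in range(min(ys), max(ys) + 1)}

# Toy alpha function: the shared border column x == 2 was drawn twice.
alphas = traverse_region(lambda x, y: 0.75 if x == 2 else 0.5,
                         [(0, 0), (4, 2)])
```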
It should be noted that the above-mentioned device may also include other implementation manners according to the description of the method embodiment. For specific implementation manners, reference may be made to the description of the related method embodiments, which will not be repeated here.
In one embodiment of this specification, a computer storage medium may also be provided, on which a computer program is stored; when the computer program is executed, the data processing method for merging map areas in the above embodiments is implemented, for example, the following method may be implemented:
drawing a map image of the area to be merged using lines with a preset transparency, to generate an initial merged map image;
acquiring the transparency of the pixels in the initial merged map image;
taking pixels whose transparency is the same as the preset transparency as target pixels, and generating a merged target merged map image according to the target pixels.
The foregoing describes specific embodiments of this specification. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The methods or devices described in the above embodiments of this specification may implement the business logic through a computer program recorded on a storage medium; the storage medium may be read and executed by a computer to achieve the effects of the solutions described in the embodiments of this specification.
The data processing method or device for merging map areas provided by the embodiments of this specification may be implemented by a processor executing corresponding program instructions in a computer, for example, implemented on a PC in the C++ language under the Windows operating system, implemented under a Linux system, implemented on a smart terminal using the programming languages of the Android or iOS systems, or implemented with processing logic based on a quantum computer. In an embodiment of a data processing system for merging map areas provided in this specification, FIG. 6 is a schematic diagram of the module structure of an embodiment of the data processing system for merging map areas provided in this specification. As shown in FIG. 6, the data processing system for merging map areas provided in the embodiments of this specification may include a processor 61 and a memory 62 for storing instructions executable by the processor.
The processor 61 and the memory 62 communicate with each other through a bus 63;
The processor 61 is configured to call the program instructions in the memory 62 to execute the methods provided in the above embodiments of the data processing method for merging map areas, for example, including: drawing a map image of the area to be merged using lines with a preset transparency, to generate an initial merged map image; acquiring the transparency of the pixels in the initial merged map image; and taking pixels whose transparency is the same as the preset transparency as target pixels, and generating a merged target merged map image according to the target pixels.
It should be noted that the device, computer storage medium, and system described above in this specification may also include other implementation manners according to the description of the related method embodiments. For specific implementations, refer to the description of the method embodiments, which is not repeated here.
The various embodiments in this specification are described in a progressive manner; for the same or similar parts, the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the hardware-plus-program embodiment, since it is basically similar to the method embodiment, the description is relatively simple; for relevant parts, refer to the description of the method embodiment.
The embodiments of this specification are not limited to situations that must comply with industry communication standards, standard computer data processing and data storage rules, or the conditions described in one or more embodiments of this specification. Implementations slightly modified on the basis of certain industry standards, custom methods, or the described embodiments can also achieve implementation effects that are the same as, equivalent to, or similar to those of the foregoing embodiments, or that are predictable after modification. Embodiments obtained by applying such modified or varied data acquisition, storage, judgment, and processing methods may still fall within the scope of the optional implementations of the embodiments of this specification.
In the 1990s, an improvement to a technology could be clearly distinguished as either a hardware improvement (for example, an improvement to the circuit structure of diodes, transistors, switches, and the like) or a software improvement (an improvement to a method flow). However, with the development of technology, improvements to many of today's method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device. Designers program to "integrate" a digital system onto a PLD by themselves, without needing a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of manually fabricating integrated circuit chips, this programming is nowadays mostly realized with "logic compiler" software, which is similar to the software compiler used in program development; the original code before compilation must also be written in a particular programming language, called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); the most commonly used at present are VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog.
Those skilled in the art should also be clear that a hardware circuit implementing a logical method flow can easily be obtained merely by logically programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an Application-Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320; a memory controller may also be implemented as part of the memory's control logic. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for realizing various functions can also be regarded as structures within the hardware component. Or, even, the devices for realizing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules, or units set forth in the above embodiments may be implemented by computer chips or entities, or by products having certain functions. A typical implementing device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
Although one or more embodiments of this specification provide the method operation steps described in the embodiments or flowcharts, more or fewer operation steps may be included based on conventional or non-inventive means. The sequence of steps listed in the embodiments is only one of many possible execution orders and does not represent the only one. When an actual device or terminal product executes, the steps may be executed sequentially or in parallel according to the methods shown in the embodiments or drawings (for example, in a parallel-processor or multi-threaded processing environment, or even a distributed data processing environment). The terms "comprise", "include", and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, product, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, product, or device. Unless further restricted, the presence of additional identical or equivalent elements in the process, method, product, or device including the listed elements is not excluded. Words such as "first" and "second" are used to denote names and do not denote any particular order.
For convenience of description, the above devices are described with the functions divided into various modules. Of course, when implementing one or more embodiments of this specification, the functions of the modules may be implemented in one or more pieces of software and/or hardware, modules realizing the same function may be implemented as a combination of multiple sub-modules or sub-units, and so on. The device embodiments described above are merely illustrative. For example, the division into units is only a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus, the instruction apparatus realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps is performed on the computer or other programmable equipment to produce computer-implemented processing, such that the instructions executed on the computer or other programmable equipment provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, the computing device includes one or more processors (CPU), input/output interfaces, network interfaces, and memory.
Memory may include non-permanent storage in computer-readable media, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology. Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage, graphene storage, or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
Those skilled in the art should understand that one or more embodiments of this specification may be provided as a method, a system, or a computer program product. Therefore, one or more embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.
One or more embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. One or more embodiments of this specification may also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in local and remote computer storage media, including storage devices.
The various embodiments in this specification are described in a progressive manner; for the same or similar parts, the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the system embodiment, since it is basically similar to the method embodiment, the description is relatively simple; for relevant parts, refer to the description of the method embodiment. In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example", "some examples", and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this specification. In this specification, schematic use of these terms does not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, without mutual contradiction, those skilled in the art may combine the different embodiments or examples described in this specification and the features of the different embodiments or examples.
The above descriptions are only examples of one or more embodiments of this specification and are not intended to limit the one or more embodiments of this specification. For those skilled in the art, the one or more embodiments of this specification may have various modifications and changes. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this specification shall be included within the scope of the claims.