CN217113409U - Self-service terminal and non-transitory computer readable medium
- Publication number: CN217113409U
- Application number: CN202121931846.9U
- Authority: CN (China)
- Prior art keywords: user, mobile device, kiosk, price, mobile phone
- Prior art date
- Legal status: Active
Classifications
- G07F7/06—Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles
- G06Q10/30—Administration of product recycling or disposal
- G06Q30/0237—Discounts or incentives, e.g. coupons or rebates, at kiosk
- G06Q30/0278—Product appraisal
- G06Q30/0601—Electronic shopping [e-shopping]
- Y02W90/00—Enabling technologies or technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation
Abstract
Disclosed herein are embodiments of self-service terminals, non-transitory computer-readable media, and related systems and methods for evaluating devices and presenting offers for those devices through the use of Optical Character Recognition (OCR) systems and/or other associated devices and systems. The system may utilize a wireless charger to obtain information about a device while charging the device. The information may include the brand of the device. The system may direct the user to navigate the device so that it displays additional information such as model number, storage capacity, unique identifier, and the carrier associated with the device, which may be extracted from the device using OCR. Based on, for example, the unique identifier, the system can determine whether the device has been stolen. If the device has not been stolen, the system may determine a price based on the information obtained from the device and may present the price to the user. If the user accepts the price, the system may complete the purchase of the device.
Description
Cross reference to related applications
This application claims the benefit of U.S. Provisional Patent Application No. 63/066,794, filed August 17, 2020 (attorney docket no. 111220-8058.US00), and U.S. Provisional Patent Application No. 63/116,020, filed November 19, 2020 (attorney docket no. 111220-8059.US00), both of which are incorporated herein by reference in their entirety.
Technical Field
The present technology relates generally to methods and systems for evaluating and recycling mobile phones and other consumer devices, and more particularly to hardware and/or software for facilitating device identification, evaluation, purchase, and/or other processes associated with electronic device recycling.
Background
More mobile phones and other electronic devices (e.g., laptop computers, notebook computers, tablet computers, PDAs, MP3 players, wearable smart devices, etc.) are now in use than there are people on Earth. The proliferation of mobile phones is due in part to the rapid pace at which they evolve. Because of this rapid rate of development, mobile phones are replaced at a relatively high rate each year as consumers continually upgrade to obtain the latest features or better service plans. According to U.S. Environmental Protection Agency data, over 370 million mobile phones, PDAs, tablet computers, and other electronic devices are disposed of annually in the United States alone. Millions of other outdated or damaged mobile phones are simply tossed into a drawer or otherwise kept until a suitable disposal option presents itself.
While many mobile phone retailers and mobile phone carrier stores now offer trade-in or buy-back programs, many old phones still end up in landfills or are improperly disassembled and disposed of in developing countries. Unfortunately, mobile phones and similar devices often contain environmentally hazardous substances such as arsenic, lithium, cadmium, copper, lead, mercury, and zinc. If not properly disposed of, these toxic substances can leach from decomposing landfills into groundwater and contaminate the soil, with potentially harmful consequences for humans and the environment.
As an alternative to retailer trade-in or buy-back programs, consumers may recycle and/or sell their used mobile phones at kiosks located in shopping malls, retail stores, or other public areas. These kiosks are operated by ecoATM, LLC, the assignee of the present application, and embodiments of these kiosks are described in, for example, U.S. Patent Nos. 8,463,646, 8,423,404, 8,239,262, 8,200,533, 8,195,511, and 7,881,965, each of which is incorporated herein by reference in its entirety.
SUMMARY OF THE UTILITY MODEL
A self-service terminal, the self-service terminal comprising: one or more processors; and a non-transitory computer-readable medium storing instructions that, when executed by the one or more processors, cause the one or more processors to: recording, by at least one camera of the kiosk, a visual representation of a mobile device placed in proximity to the at least one camera, wherein the visual representation includes perspective distortion of the mobile device due to a position of the at least one camera relative to the mobile device; creating an augmented reality representation of the mobile device based on the visual representation by causing the one or more processors to perform the steps of: correcting perspective distortion associated with the visual representation; generating a message to guide a user to cause the mobile device to visually display additional information associated with the mobile device; and combining the corrected visual representation and the message to generate the augmented reality representation; providing, by a user interface of the self-service terminal, the created augmented reality representation to the user; receiving visually provided additional information associated with the mobile device; extracting device information from the visually provided additional information using Optical Character Recognition (OCR); generating a first price for the mobile device and a second price for the mobile device based on the extracted device information, wherein the first price represents a value of the mobile device when the mobile device is damaged, wherein the second price represents a value of the mobile device when the mobile device is not damaged, and wherein the first price is lower than the second price; and presenting the first price and the second price to the user.
A non-transitory computer-readable medium storing instructions that, when executed by at least one computing device of a kiosk, cause the at least one computing device to: recording, by at least one camera of the kiosk, a visual representation of a mobile device placed in proximity to the at least one camera, wherein the visual representation includes perspective distortion of the mobile device due to a position of the at least one camera relative to the mobile device; creating an augmented reality representation of the mobile device based on the visual representation by causing the at least one computing device to: correcting perspective distortion associated with the visual representation; generating a message to guide a user to cause the mobile device to visually display additional information associated with the mobile device; and combining the corrected visual representation and the message to generate an augmented reality representation; providing, by a user interface of the self-service terminal, the created augmented reality representation to a user; receiving visually provided additional information associated with the mobile device; extracting device information from the visually provided additional information using Optical Character Recognition (OCR); generating a first price for the mobile device and a second price for the mobile device based on the extracted device information, wherein the first price represents a value of the mobile device when the mobile device is damaged, wherein the second price represents a value of the mobile device when the mobile device is not damaged, and wherein the first price is lower than the second price; and presenting, by the user interface, the first price and the second price to the user.
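The summary above recites correcting the perspective (keystone) distortion introduced by the camera's position relative to the mobile device. The following is a minimal illustrative sketch of such a correction, assuming OpenCV is available and that the four corners of the device in the camera frame have already been located by an upstream detection step; the function name and inputs are hypothetical and are not part of the claimed subject matter.

```python
import cv2
import numpy as np

def correct_perspective(frame, device_corners, out_w=480, out_h=960):
    """Warp a camera frame so the device face appears as if viewed head-on.

    frame          -- BGR image captured by the kiosk camera
    device_corners -- 4x2 array of the device's corner pixels, ordered
                      top-left, top-right, bottom-right, bottom-left
                      (assumed to come from an upstream detection step)
    """
    src = np.asarray(device_corners, dtype=np.float32)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    # Homography that removes the keystone distortion caused by the camera
    # viewing the device from an angle.
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, matrix, (out_w, out_h))
```

The corrected image could then be composited with a guidance message to form the augmented reality representation described above.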
Drawings
FIG. 1 is an isometric view of a consumer-operated kiosk for purchasing a mobile electronic device from a user configured in accordance with an embodiment of the present technology.
FIGS. 2A-2C are a series of enlarged isometric views illustrating the structure and functionality associated with the detection area of the kiosk of FIG. 1, configured in accordance with embodiments of the present technology.
FIGS. 3A-3C are front right, front left, and rear left isometric views, respectively, of the detection area of FIGS. 2A-2C configured in accordance with an embodiment of the present technology, with the housing and other external structures removed to better illustrate internal components associated with the detection area, and FIG. 3D is a front view of a detection area light fixture configured in accordance with an embodiment of the present technology.
FIG. 4A is a rear right isometric view of a mobile device detection tray assembly configured in accordance with an embodiment of the present technology, and FIG. 4B is a schematic diagram of a wireless charger mounted on the detection tray assembly in accordance with an embodiment of the present technology.
FIGS. 5A-5G are a series of front views of a kiosk detection area illustrating operation of a mobile device flipping mechanism configured in accordance with embodiments of the present technology.
FIGS. 6A-6C are a series of front views illustrating the operation of a mobile device flipping mechanism configured in accordance with other embodiments of the present technology.
FIGS. 7A-7C are a series of cross-sectional side views of the kiosk of FIG. 1 illustrating three stages of detection tray operation in accordance with embodiments of the present technology.
FIG. 8 is a schematic diagram illustrating various components associated with the kiosk of FIG. 1 configured in accordance with embodiments of the present technology.
FIG. 9 is a schematic diagram of a suitable network environment for implementing aspects of an electronic device recycling system configured in accordance with embodiments of the present technology.
FIG. 10 is a front view illustrating a mobile electronic device located in the kiosk of FIG. 1 with an image of the mobile electronic device presented on the kiosk display, in accordance with embodiments of the present technology.
FIG. 11 is a front view similar to FIG. 10 illustrating keystone correction applied to a mobile electronic device image presented on a kiosk display in accordance with embodiments of the present technology.
FIG. 12 is a front view similar to FIG. 11, illustrating a visual representation of a user interacting with a mobile electronic device presented on a kiosk display in accordance with embodiments of the present technology.
FIG. 13A is a front view similar to FIG. 12 showing an augmented reality display presented on a kiosk display, and FIG. 13B shows a natural language user interface at the kiosk, in accordance with embodiments of the present technology.
FIG. 14A is an isometric view of the kiosk of FIG. 1 illustrating a Quick Response (QR) code displayed on the kiosk display in accordance with embodiments of the present technology.
FIG. 14B illustrates a user interface of a mobile electronic device displaying device testing including touch screen functionality in accordance with some embodiments of the present technology.
FIG. 15 illustrates a user interface of a mobile electronic device displaying a microphone test of the device in accordance with some embodiments of the present technology.
FIG. 16 illustrates a user interface of a mobile electronic device displaying Global Positioning System (GPS) testing of the device in accordance with some embodiments of the present technology.
FIG. 17 illustrates a user interface of a mobile electronic device displaying a display test of the device in accordance with some embodiments of the present technology.
FIGS. 18A-18B illustrate a flow diagram of a method of providing a mobile device purchase price, in accordance with some embodiments of the present technology.
FIG. 19 is a partial isometric view of a robotic stylus system configured in accordance with some embodiments of the present technology.
Detailed Description
The following describes various embodiments of systems and methods that enable users to sell, trade, or recycle mobile phones and other electronic devices. In some embodiments, the systems described herein include a kiosk or other structure having a wireless charger in or near a device detection area and an associated Optical Character Recognition (OCR) system. As described in more detail below, in various embodiments, the wireless charger may be used to obtain information about a device (e.g., a mobile phone) placed in proximity to the wireless charger, and this information may be used to determine an offer for the device. Embodiments of such wireless chargers are described in co-pending U.S. patent application attorney docket no. 111220-8057.US01 and U.S. Provisional Patent Application No. 63/116,007, filed November 19, 2020 (attorney docket no. 111220-8057.US00), which are incorporated herein by reference in their entirety. The kiosk may also direct the user to navigate menus on the device and cause the device to display additional information such as model number, storage capacity, unique identifier (e.g., International Mobile Equipment Identity (IMEI) number), and/or carrier. The kiosk may capture and evaluate the displayed information via, for example, one or more cameras and an OCR system.
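By way of illustration only, the following is a minimal sketch of how device details might be pulled out of OCR output once the user has navigated to a settings or "about" screen. The field patterns, the assumed text layout, and the DeviceInfo structure are assumptions for the example, not a description of the kiosk's actual software.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceInfo:
    model: Optional[str] = None
    storage: Optional[str] = None
    imei: Optional[str] = None
    carrier: Optional[str] = None

# Example field patterns; real "about phone" screens vary by brand and OS version.
PATTERNS = {
    "model": re.compile(r"Model(?: number)?\s*[:\-]?\s*([A-Za-z0-9\- ]+)", re.I),
    "storage": re.compile(r"(\d+\s*[GT]B)", re.I),
    "imei": re.compile(r"IMEI\s*[:\-]?\s*(\d{15})", re.I),
    "carrier": re.compile(r"(?:Carrier|Network)\s*[:\-]?\s*([A-Za-z&\- ]+)", re.I),
}

def parse_ocr_text(ocr_text: str) -> DeviceInfo:
    """Extract device fields from OCR'd text captured by the kiosk camera."""
    info = DeviceInfo()
    for field, pattern in PATTERNS.items():
        match = pattern.search(ocr_text)
        if match:
            setattr(info, field, match.group(1).strip())
    return info
```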
Based on the unique identifier, the kiosk and/or an associated system may determine whether the device has been stolen. If the device has not been stolen, the kiosk may determine an estimated price or price range for the device based on the obtained information and present the price or price range to the user. The price range may include a high price and a low price. If the user wishes to continue and sell the device, the kiosk may notify the user that further testing of the device may be required (e.g., to detect whether the device display has a flaw) before the higher price can be offered, and that the additional testing may take several minutes. If the user does not wish to wait, the user may accept the low price and the kiosk may proceed to purchase the device at the lower price. Conversely, if the user wishes to have the device tested further in order to obtain a higher price, the kiosk may proceed with the further testing and then provide a more accurate (and possibly higher) price based on the more thorough testing. In some embodiments described herein, kiosks and other systems may perform the further testing by: causing the device to run one or more tests; imaging the device using one or more cameras to visually detect, for example, whether the device display screen has flaws or other damage; and/or connecting a cable to the device and performing functional electrical testing of the device.
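A minimal sketch of the decision flow described above follows. The stolen-device lookup, price table, and full-test routine are placeholders for backend services whose interfaces are assumptions for this example.

```python
def quote_device(imei, device_info, stolen_db, price_table):
    """Return an initial low/high quote, or None if the device is reported stolen."""
    if stolen_db.is_reported_stolen(imei):   # hypothetical backend call
        return None                          # transaction is declined
    low, high = price_table.price_range(device_info)  # damaged vs. undamaged value
    return {"low": low, "high": high}

def finalize_offer(quote, user_wants_full_test, run_full_test):
    """Pick the payout path the user selects at the kiosk display."""
    if not user_wants_full_test:
        return quote["low"]            # immediate purchase at the lower price
    condition = run_full_test()        # visual and/or electrical inspection
    return quote["high"] if condition == "undamaged" else quote["low"]
```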
Certain details are set forth below and in figures 1-19 to provide a thorough understanding of various embodiments of the present technology. In other instances, well-known structures, materials, operations, and/or systems and the like, typically associated with smartphones and other handheld mobile electronic devices, consumer electronic devices, computer hardware, software, and network systems and the like, are not shown or described in detail below to avoid unnecessarily obscuring descriptions of the various embodiments of the present technology. One skilled in the relevant art will recognize, however, that the technology can be practiced without one or more of the details described herein, or with other structures, methods, components, and so forth.
The terminology used below is to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain examples of embodiments of the technology. Indeed, certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this detailed description section.
The accompanying drawings depict embodiments of the present technology and are not intended to limit its scope. Where certain component details are not necessary for a complete understanding of how the present technology is made and used, such details, for example the positions of components and some precise connections between components, may be omitted from the drawings. Many of the details, dimensions, angles, and other features shown in the figures are merely illustrative of particular embodiments of the present technology. Accordingly, other embodiments may have other details, dimensions, angles, and features without departing from the spirit or scope of the present technology. Furthermore, those of ordinary skill in the art will understand that further embodiments of the present technology may be practiced without some of the details described below. In the drawings, like reference numbers identify identical or at least substantially similar elements. To facilitate the discussion of any particular element, the most significant digit or digits of any reference number refer to the figure in which that element is first introduced. For example, element 110 is first introduced and discussed with reference to FIG. 1.
FIG. 1 is an isometric view of a consumer-operated kiosk 100 for purchasing mobile phones and other electronic devices from users, configured in accordance with embodiments of the present technology. The kiosk 100 includes a housing or enclosure 102 that supports a display screen 104 above a detection area access door 112. The access door 112 may be formed of, for example, various types of plastics (e.g., polyethylene, polycarbonate, etc.), glass, etc., and may be transparent, semi-transparent, or opaque (solid). The housing 102 may be made in a conventional manner, for example, from sheet metal, plastic panels, and the like. By way of example only, in some embodiments, the width W of the kiosk 100 may be about 7 inches to about 14 inches, or about 8 inches to about 9 inches; the depth D may be about 12 inches to about 18 inches, or about 14 inches to about 15 inches; and the total height H may be about 3 feet to about 5 feet, or about 4 feet. The above dimensions are only one example; in other embodiments, a kiosk configured in accordance with the present disclosure may have other dimensions without departing from the present disclosure.
The front of the housing 102 provides a number of user interface devices for providing messages or instructions and other information to a user and/or for receiving user input and other information from a user. For example, in some embodiments, the display screen 104 may include a Liquid Crystal Display (LCD) or Light Emitting Diode (LED) display screen, a projection display (e.g., a heads-up display or head-mounted device), or the like, to provide information, prompts, or the like to the user. The display screen 104 may also display a Graphical User Interface (GUI), including a touch screen, for receiving user input and responses to displayed prompts. Additionally or alternatively, the kiosk 100 may include a separate keyboard or keys for receiving user input. The kiosk 100 may also include an ID reader or scanner 108 (e.g., a driver's license scanner), a fingerprint reader 118, and one or more external cameras 106 (e.g., digital cameras and/or video cameras).
FIGS. 2A-2C are a series of enlarged isometric views illustrating the structure and functionality associated with the detection area 216 of the kiosk 100. Referring first to FIG. 2A, in some embodiments, a user wishing to sell an electronic device (e.g., a used mobile phone 210) via the kiosk 100 will first approach the kiosk 100 and follow the prompts displayed on the display screen 104. For example, such prompts may ask the user what type of phone the user wishes to sell/trade, who the carrier is, and so on. To begin the process, the access door 112 (FIG. 1) is retracted upward behind the display screen 104 to expose the detection area 216. In other embodiments, the access door 112 may be stowed in a retracted position in other ways. The detection area 216 includes a detection tray 212 having a shelf 218. The detection tray 212 is sometimes referred to herein simply as "tray 212." The user is instructed (e.g., via a prompt on the display screen 104) to place the mobile phone 210 on the shelf 218 so that it faces outward toward the user and the electrical connector receptacle on the mobile phone 210 faces downward toward the shelf 218, as shown in FIG. 2A. In some embodiments, the user may also be asked to turn on the mobile phone 210.
In some embodiments, the self-service terminal 100 includes a wireless charger 222 positioned in relatively close proximity to the mobile phone 210 when the phone is on the detection tray 212 as shown in FIG. 2A. For example, in the illustrated embodiment, the wireless charger 222 is mounted on the back side of the detection tray 212 (which may also be referred to as the underside of the detection tray 212). The wireless charger 222 may be, for example, a "Qi wireless charger" that operates according to the Qi open interface standard, which defines wireless power transfer using inductive charging over distances of up to, for example, about 1.6 inches. In operation, the wireless charger 222 may provide a quick charge to the mobile phone 210 if the mobile phone 210 is placed on the detection tray 212 without any remaining charge. Further, as described in more detail below, in some embodiments the wireless charger 222 may receive certain information about the mobile phone (e.g., make, model, a unique 32-bit identifier associated with the phone, and/or the Qi standard supported, etc.) as part of the charging process.
In some embodiments, the unique ID of the mobile phone 210 is identified and the wireless connection between the wireless charger 222 and the mobile phone 210 is maintained throughout the self-service terminal transaction, enabling the self-service terminal 100 to monitor whether the user attempts to replace the mobile phone 210 with another mobile phone (e.g., another phone of the same model but of lower value) at any time during the transaction. For example, if the connection between the wireless charger 222 and the mobile phone 210 is broken, the self-service terminal 100 requires the connection to be re-established before purchasing the mobile phone 210. Re-establishing the connection involves, for example, providing the unique ID in a header packet of the wireless charging protocol. By obtaining the unique ID from the header packet, the kiosk 100 can identify whether the mobile phone 210 has been replaced. If so, the self-service terminal 100 may stop the transaction, display an alert to the user, and/or take other steps to prevent the user from committing fraud by obtaining a price for one phone but actually selling a different, lower-value phone to the self-service terminal.
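The following is a minimal sketch of the swap-detection idea described above, assuming a charger driver object that can report the device ID carried in the charging protocol's header packet; that interface is a stand-in, since real Qi packet access depends on the charger hardware and firmware.

```python
class SwapMonitor:
    """Track the charger-reported device ID so a mid-transaction phone swap
    can be detected and the transaction halted."""

    def __init__(self, charger):
        self.charger = charger
        # e.g., the unique 32-bit identifier read from the header packet
        self.expected_id = charger.read_device_id()

    def connection_ok(self) -> bool:
        """True while the same device is still present on the tray."""
        return self.charger.read_device_id() == self.expected_id

    def require_same_device(self):
        """Call before completing the purchase; abort if the phone changed."""
        if not self.connection_ok():
            raise RuntimeError("Device on the tray changed; transaction halted")
```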
In some embodiments, the information obtained from the wireless charger 222 may include only the brand of the mobile phone 210. In most cases, the brand of the mobile phone 210 will be only part of the information needed to provide an accurate price quote for the device. Upon receiving the brand of the mobile phone 210, the kiosk 100 may offer to purchase the mobile phone 210 from the user at a low price based solely on the phone's brand, or, if the user is willing to wait, the kiosk 100 may offer to perform a more thorough evaluation of the mobile phone 210 and possibly offer the user a higher price.
If the user decides to wait and authorizes the kiosk 100 to perform a more thorough evaluation, the kiosk 100 may instruct the user (e.g., via prompts displayed on the kiosk display screen 104) how to navigate through a menu on the mobile phone 210 while the mobile phone 210 is positioned on the detection tray 212 as shown in FIG. 2A, so that the mobile phone 210 displays information about the device, such as model, carrier, storage capacity, unique identification number (e.g., IMEI number), and so forth. The kiosk 100 may adjust the messages or instructions presented to the user based on the brand of the mobile phone 210 obtained from the wireless charger 222. For example, the messages or instructions presented for a user's Google device may differ from the messages or instructions presented for an Apple® device. A camera (not shown in FIG. 2A) in or near the detection area 216 may capture one or more images of the information displayed on the mobile phone 210, and an OCR system associated with the self-service terminal 100, such as the OCR system 807 in FIG. 8, may extract the device information from the images. Further embodiments are described in co-pending U.S. patent application attorney docket no. 111220-8058.US01 and U.S. Provisional Patent Application No. 63/066,794, filed August 17, 2020 (attorney docket no. 111220-8058.US00), which are incorporated herein by reference in their entirety. In some embodiments, the information obtained by the wireless charger 222 and/or the associated OCR system may be used to present a more accurate price or price range to the user without connecting a cable from the self-service terminal 100 to the mobile phone 210.
If the kiosk 100 is unable to obtain the information needed to accurately price the mobile phone 210 by directing the user to navigate the phone display, for example because the user is unable to follow the provided messages or instructions, or because the user does not want to spend time doing so, the kiosk 100 may present an electrical connector configured to connect to the mobile phone 210. For example, as described in more detail below, in some embodiments the kiosk 100 may present a suitable electrical connector (not shown) by extending it outwardly through a hole in the shelf 218 so that when the user places the mobile phone 210 on the shelf 218 as shown in FIG. 2A, he or she may connect the mobile phone 210 to the connector. The kiosk 100 may determine the appropriate electrical connector to present to the user based on the brand of the mobile phone 210. The kiosk 100 may also select an appropriate electrical connector in response to, for example, the user's answer to a question presented on the display screen 104 regarding the type of phone the user wishes to sell. After the user connects the mobile phone 210 to the appropriate electrical connector, the kiosk 100 may perform an electrical inspection of the device via the electrical connector to obtain additional device information, as described below with reference to FIG. 2B. In some embodiments, the self-service terminal 100 may perform the electrical testing via the electrical connector while the detection tray 212 is in the position shown in FIG. 2A. In other embodiments, the kiosk 100 may perform the electrical testing of the mobile phone 210 after the detection tray 212 is rotated to the position shown in FIG. 2B.
Before or after the kiosk 100 acquires the additional device information, the kiosk may perform a visual inspection of the mobile phone 210 to check for defects (e.g., whether the device screen is cracked or whether the mobile phone 210 has other physical damage), as also described below with reference to FIG. 2B. For example, the price offered for the mobile phone 210 will be lower if its screen is cracked than if it is not. After obtaining the device information and performing the visual inspection, the kiosk 100 may present the user with a more accurate price for the mobile phone 210.
Turning next to FIG. 2B, the detection tray 212 has been rotated back to a horizontal position so that the mobile phone 210 faces upward in the detection area 216. Although in some embodiments the access door 112 (FIG. 1) is normally closed during this phase of kiosk operation, the access door 112 is not shown in FIG. 2B (or FIG. 2C) to better illustrate the operation of the detection tray 212 and the detection area 216. As described above, the kiosk 100 may perform electrical testing of the mobile phone 210 via the electrical connector to, for example, identify the phone and further evaluate its condition, as well as identify specific components and operating parameters, such as memory, carrier, and the like. For example, in some embodiments, the kiosk 100 (e.g., a kiosk Central Processing Unit (CPU) or other processing device) may query the mobile phone 210 (e.g., using an operating system Application Programming Interface (API)) to obtain characteristic information about the mobile phone 210, which may include device identification, make, model, and/or configuration. In other embodiments, the characteristic information may also include device functionality, including hardware/software configuration, charging capabilities, storage capacity, and the like. The information needed to identify and/or evaluate a mobile device (e.g., the mobile phone 210) may include, for example, a unique identifier (e.g., the IMEI code or mobile equipment identifier (MEID) of the mobile phone or an equivalent number, the hardware media access control address (MAC address) of a networkable device, or the model and serial number of the electronic device), information describing the device manufacturer (e.g., manufacturer name or ID code), model, characteristics and performance (e.g., CPU type and speed, storage capacity (SRAM, DRAM, disk, etc.), wireless carrier, wireless frequency bands (frequency range and coding, such as CDMA, GSM, LTE, etc.)), color, and/or condition. The kiosk 100 may perform the electrical analysis using one or more of the methods and/or systems described in detail in the patents and patent applications incorporated herein by reference.
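As one concrete illustration of the kind of operating-system query mentioned above, the sketch below reads a few basic properties from an Android phone over a wired connection using the Android Debug Bridge (adb). The choice of adb, the subprocess wrapper, and the assumption that debugging is enabled on the phone are all illustrative; this is not a statement of how the kiosk's electrical analysis is actually implemented.

```python
import subprocess

def adb_getprop(prop: str) -> str:
    """Read one Android system property over a wired adb connection."""
    result = subprocess.run(
        ["adb", "shell", "getprop", prop],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def collect_device_properties() -> dict:
    """Gather a few standard Android properties useful for identification."""
    return {
        "manufacturer": adb_getprop("ro.product.manufacturer"),
        "model": adb_getprop("ro.product.model"),
        "os_version": adb_getprop("ro.build.version.release"),
    }
```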
Although the above-described embodiments include establishing an electrical connection between the mobile telephone 210 and the kiosk 100 via an electrical connector, in other embodiments the kiosk 100 may establish a wireless connection with the mobile telephone 210 in order to perform all or part of the telephone evaluation and purchase steps described herein. For example, in some embodiments, the kiosk 100 may include a radio transceiver accessible by a user device (e.g., mobile phone 210). The kiosk 100 may establish a wireless connection with the mobile phone 210 by providing a connection message or instruction and/or authentication information to the user for input via the mobile phone 210 and/or via the display screen 104. For example, the kiosk 100 may direct the user to make the bluetooth connection of the mobile phone discoverable and/or may provide a bluetooth pairing code that the user may enter on the screen of the mobile phone 210 or the touch screen of the kiosk 100. As another example, the kiosk 100 may provide a Wi-Fi network name and/or password that, when selected and/or entered on the user phone 210, enables the user to wirelessly connect the device to a designated Wi-Fi network. In other embodiments, establishing the connection may include providing a visual code or image (e.g., a QR code) for the user to scan using the mobile phone 210, such that scanning the code or image prompts the phone to connect to the self-service terminal's wireless network (e.g., upon confirmation by the user). In some embodiments, establishing a connection may include allowing a particular wireless device to join or use a wireless network or to make a wireless connection. For example, when the kiosk 100 detects the mobile phone 210 and determines that the device is registered for access or identified, the kiosk 100 automatically connects to the mobile phone 210 without further user authentication. In other embodiments, the user may load a mobile application onto the mobile phone 210, and the application may evaluate the electronic device and facilitate wireless communication between the mobile phone 210 and the self-service terminal 100 to facilitate phone evaluation and purchase of the self-service terminal 100. Various systems and methods for establishing a wireless connection between a self-service terminal 100 and a user's mobile phone or other electronic device are described in at least some patents and/or patent applications incorporated by reference herein in their entirety. In other embodiments, wireless connections between the kiosk 100 and mobile phones and other electronic devices may be established using other suitable means known in the art.
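One of the connection options mentioned above is displaying a scannable code that joins the phone to the kiosk's Wi-Fi network. A minimal sketch is shown below using the standard Wi-Fi QR payload format and the third-party Python qrcode package; the package choice and output path are assumptions for the example.

```python
import qrcode  # third-party package, assumed installed on the kiosk

def wifi_qr_payload(ssid: str, password: str) -> str:
    """Standard Wi-Fi network QR payload; most phones offer to join the
    network when this code is scanned with the camera."""
    return f"WIFI:T:WPA;S:{ssid};P:{password};;"

def make_connection_qr(ssid: str, password: str, path: str = "connect_qr.png") -> str:
    """Render the QR image so the kiosk can show it on its display."""
    img = qrcode.make(wifi_qr_payload(ssid, password))
    img.save(path)
    return path
```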
As described above, in addition to performing electrical testing, the kiosk 100 may also perform visual inspection of the mobile phone 210 using one or more cameras (not shown) located in the detection area 216. In some embodiments, the visual inspection may include 3D visual analysis (e.g., of the shape and/or size of the phone) to confirm the identity (e.g., make and model) of the mobile phone 210 and/or to assess or estimate the condition and/or functionality of the mobile phone 210 and/or its various components and systems. In some embodiments, the detection tray 212 may be colored (e.g., the detection tray may be made of colored plastic, such as green plastic) so that chroma key or color keying techniques (sometimes referred to as green screening when a green backdrop is used) may be used, for example, to remove the detection tray 212 from images of the mobile phone 210 acquired by one or more cameras located in the detection area 216. Chroma keying the image may better delineate the physical features of the phone and enable the kiosk processor to more accurately determine the dimensions (e.g., width, height, and curvature) of the mobile phone 210 from the image. As described in more detail below, in other embodiments the detection tray 212 is not green (or another opaque color) but may instead be configured as a light table, so that the resulting images provide a more accurate profile of the mobile phone 210 for visual analysis of, for example, phone size, shape, and the like.
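A minimal sketch of the chroma-key idea described above follows, assuming OpenCV. The HSV thresholds and the pixels-per-inch calibration are illustrative values; a deployed kiosk would calibrate them for its particular camera, lighting, and tray color.

```python
import cv2
import numpy as np

def measure_phone_outline(image_bgr: np.ndarray, px_per_inch: float):
    """Mask out the green tray pixels and measure the remaining phone outline."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([40, 60, 60])    # illustrative lower bound for green
    upper = np.array([85, 255, 255])  # illustrative upper bound for green
    tray_mask = cv2.inRange(hsv, lower, upper)   # tray pixels
    phone_mask = cv2.bitwise_not(tray_mask)      # everything that is not tray
    contours, _ = cv2.findContours(phone_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return {"width_in": w / px_per_inch, "height_in": h / px_per_inch}
```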
The visual analysis may also include detecting whether the display screen (e.g., LCD) of the mobile phone 210 has cracks or other damage. In some embodiments, the visual inspection may include performing OCR to identify printed or displayed patterns, codes, and/or text and comparing characteristics (e.g., layout, size, font, color, etc.) of the patterns, codes, and/or text to templates to determine the presence of a device identifier such as a model number, serial number, IMEI number, etc. As described in more detail below, the visual inspection may be performed using one or more cameras, and the kiosk 100 may perform the visual analysis using one or more of the methods and/or systems described herein and described in detail in the patents and patent applications incorporated by reference herein in their entirety. Some mobile phones include a unique identifier (e.g., an IMEI number) that is printed or otherwise formed on the phone's subscriber identity module ("SIM") card holder or tray. In some embodiments, the self-service terminal 100 may instruct the user to remove the SIM card tray from their phone and place it on the detection tray 212 so that the self-service terminal can perform OCR of the IMEI number on the SIM card tray. Further, in some embodiments, the detection tray 212 may include a designated area or small tray configured to accommodate the SIM card tray so that the kiosk camera may acquire images of the IMEI number for OCR. On some mobile phones (e.g., older iPhone and Samsung phones), the IMEI number may be printed or otherwise formed on the back side of the phone (the side opposite the display). In some embodiments, the kiosk may prompt the user to place such a phone on the detection tray 212 with its back side facing outward so that the kiosk camera may acquire an image of the IMEI number printed on the back surface for OCR by the kiosk software.
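Because an OCR'd IMEI can contain misread digits, a simple integrity check is useful before the number is used for a stolen-device lookup. IMEI numbers carry a Luhn check digit, so a sketch like the following can reject obvious misreads; the function name is illustrative.

```python
def is_valid_imei(imei: str) -> bool:
    """Sanity-check an OCR'd IMEI using its Luhn check digit."""
    digits = [c for c in imei if c.isdigit()]
    if len(digits) != 15:
        return False
    total = 0
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == 1:      # double every second digit, counting from the left
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

The Luhn check detects any single-digit substitution error, which covers the most common OCR misreads, so a failed check can prompt the kiosk to re-image the label.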
As described in more detail below, in one aspect of the illustrated embodiment, detection region 216 includes a device configured to flip mobile phone 210 so that the front side of the phone faces downward toward detection tray 212 when detection tray 212 is in the position shown in FIG. 2B. This enables the kiosk 100 to perform visual inspection of the back side of the mobile phone using the same imaging system (e.g., camera system, lighting system, etc.) used to detect the front side of the mobile phone 210. This feature eliminates the need to place the mobile phone 210 on a transparent surface and provide a camera below the transparent surface to visually inspect the back side of the mobile phone 210. In addition to saving cost, this feature may also save space and reduce the size of the kiosk 100.
In some embodiments, the kiosk 100 may include a security feature that detects whether the user reaches into the detection area 216 at an inappropriate time. For example, the security feature may detect whether a user reaches into the detection area 216 when the detection tray 212 is in the position shown in FIG. 2B, e.g., to remove the mobile phone 210 and/or replace it with another phone. In some embodiments, the security feature may include a break-beam sensor system having an emitter 220a (e.g., an infrared (IR) emitter, a laser beam emitter, etc.) mounted to a sidewall portion of the detection area 216 adjacent one side of the access door opening, and a corresponding sensor 220b (e.g., an IR receiver, a laser receiver, etc.) mounted to an opposite sidewall portion on the other side of the opening to detect the beam emitted by the emitter 220a. If the user extends a hand or arm through the access door opening, the light beam emitted by the emitter 220a will be interrupted, and the sensor 220b will sense the interruption of the light beam. The sensor 220b may be configured to send a corresponding signal to the kiosk processor, and the kiosk 100 may respond by stopping the transaction, displaying an alert to the user via the display screen 104, or the like. In other embodiments, an internal camera located in the detection area may be used by the kiosk 100 to detect whether a user reaches into the detection area 216 when the detection tray is in the position shown in FIG. 2B.
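A minimal sketch of how the kiosk controller might watch the break-beam receiver follows; the hardware access is platform specific, so the receiver is represented here by a caller-supplied function rather than a real GPIO API.

```python
import threading
import time

def watch_break_beam(beam_is_intact, on_intrusion, poll_s: float = 0.02):
    """Poll a break-beam receiver and invoke on_intrusion if the beam breaks.

    beam_is_intact -- callable returning True while the beam reaches the receiver
    on_intrusion   -- callback that, e.g., halts the transaction and shows an alert
    """
    def _loop():
        while True:
            if not beam_is_intact():
                on_intrusion()
                return
            time.sleep(poll_s)

    watcher = threading.Thread(target=_loop, daemon=True)
    watcher.start()
    return watcher
```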
After the mobile phone 210 has been fully evaluated and the self-service terminal 100 has determined a purchase price, the purchase price may be offered to the user via the display screen 104. If the user accepts the purchase price offer, the access door 112 remains closed and the purchase transaction continues. For example, in some embodiments, the user may be prompted to place his or her identification (e.g., a driver's license) in the ID scanner 108 and provide a fingerprint via the fingerprint reader 118 (FIG. 1). As a fraud prevention measure, the kiosk 100 may be configured to transmit an image of the driver's license to a remote computer screen, and an operator at the remote computer may view the user through one or more cameras 106 (FIG. 1) and visually compare the image (and/or other information) on the driver's license with the person standing in front of the kiosk 100 to confirm that the person attempting to sell the mobile phone 210 is in fact the person identified by the driver's license. In some embodiments, the one or more cameras 106 may be movable to view the self-service terminal user and other individuals in the vicinity of the self-service terminal 100.
In some embodiments, the detection tray may include a mirror 213 (FIG. 2A) or other reflective surface incorporated into its upper surface to facilitate obtaining an image of the user, as described in more detail below with reference to FIG. 7A. In addition, the user's fingerprint may be checked against records of known fraud perpetrators. In some embodiments, rather than placing their identification card (e.g., a driver's license) in the ID scanner 108, the user may hold the identification card in front of the camera 106 (FIG. 1) on the exterior of the kiosk 100, and the camera 106 may capture images of the front and/or back of the ID card. The images of the ID card may be sent to the remote computer so that an operator at the remote computer can visually compare them to the user standing in front of the self-service terminal 100, as seen by the camera 106, and verify the identity of the user. The remote operator may be an artificial intelligence, such as a convolutional neural network. The image of the ID card may also be stored in a database and associated with the mobile phone sold by the user. Further, identification information may be read from the ID card image (e.g., via OCR, etc.) and checked against a database of known or suspected fraudulent sellers as a fraud prevention measure. In embodiments in which the user's ID card or other form of identification is verified via the external camera 106 as described above, the ID scanner 108 (FIG. 1) may not be necessary and may be omitted.
Once the user's identity is verified, the detection tray 212 is rotated further back, as shown in FIG. 2C, so that the mobile phone 210 may slide off the detection tray 212 and into a storage box (not shown) (although the access door 112 is normally closed during this stage of operation, it is omitted from FIG. 2C for clarity of illustration). The kiosk 100 may then provide payment of the purchase price to the user, or the kiosk 100 may hold the phone and cause payment to be made to the user as described herein. In some embodiments, the payment may be made in the form of cash dispensed from the payment outlet 110. In other embodiments, the user may receive payment for the mobile phone 210 in a variety of other ways. For example, payment may be made via a redeemable cash voucher, ticket, electronic certificate, prepaid card, or the like dispensed from the kiosk 100; or the user may be paid via a gift code sent to the user via email, text message, or other form of electronic message and redeemable for a voucher, ticket, electronic certificate, or the like. Further, in some embodiments, the user may be paid via a wired or wireless currency (e.g., cash) deposit, or via payment to an electronic account (e.g., a bank account, credit account, points/membership account, online commerce account, or mobile wallet) such as PayPal, Venmo, etc.
Alternatively, if the user rejects the purchase price offer, or if the user's identity cannot be verified, or if the self-service terminal 100 determines that the transaction carries a risk of fraud, the transaction may be declined and the mobile phone 210 returned to the user. More specifically, the detection tray 212 is rotated forward to the position shown in FIG. 2A, and the access door 112 is opened so that the user can retrieve the mobile phone 210 from the self-service terminal 100.
FIGS. 3A-3C are isometric views of the front right, front left, and rear left of the detection area 216, respectively, with some external structures omitted to better illustrate certain operational components associated with the detection area 216, and FIG. 3D is a front view of a lamp holder 332 used in the detection area 216. Referring first to FIG. 3A, the detection area 216 includes a camera 325 mounted above and facing downward toward the detection tray 212. In some embodiments, the camera 325 may be configured to capture still and/or video images of a mobile phone positioned on the detection tray 212. In some embodiments, the camera 325 may include or be combined with one or more magnifying tools, scanners, and/or other imaging components (e.g., other cameras) to view, photograph, and/or visually evaluate the mobile phone from multiple perspectives. Further, in some embodiments, the camera 325 may be movable to facilitate visual inspection of the device. In addition to the camera 325, the detection area 216 may include one or more lights directed toward the detection tray 212 to facilitate visual inspection of the mobile phone 210. For example, the detection area may include a pair of lights 338a, 338b mounted in suitable baffles on the back plate 356. Similarly, as shown in FIG. 3D, the detection area 216 may also include a plurality of lights 358a-c carried in a suitable baffle on the lamp holder 332, which, as shown in FIG. 3A, is mounted generally above the pusher member 322 toward the right side of the detection area 216.
As described above, in some embodiments, the detection tray 212 may be configured as a light table. In these embodiments, the detection tray 212 (or at least the detection surface portion 368 of the detection tray 212 on which the mobile phone is placed) may be made of a translucent (e.g., semi-transparent) material, such as translucent glass or plastic. By way of example only, in some embodiments, the translucent detection surface portion 368 may be about 0.08 inch to 0.25 inch thick, or about 0.12 inch thick. In these embodiments, the kiosk 100 may also include one or more lights 366a and 366b mounted on the kiosk tray (or other adjacent kiosk structure) below the detection area 216 and positioned to project light upwardly through the detection tray 212 during visual inspection of, for example, the mobile phone 210 (FIG. 2B). The lights 366a, 366b may include, for example, light emitting diodes (LEDs, e.g., white LEDs), fluorescent lamps, incandescent lamps, IR lamps, and the like. Configuring the detection tray 212 as a light table may enhance the contrast and outline of the device depicted in the device images captured by the camera 325 during visual evaluation of the mobile phone. This may enable the kiosk processor to more accurately assess the size, shape, external characteristics, etc. of the phone. Further, it is contemplated that in some embodiments, the kiosk 100 may also include one or more UV light sources positioned to project UV light onto mobile phones in the detection area 216 to clean the phones.
In the illustrated embodiment, the detection area 216 also includes a pusher member 322 operatively positioned toward the right side of the detection tray 212, and a ramp member 324 operatively positioned toward the left side of the detection tray 212. In operation, as described in more detail below, the pusher member 322 and the ramp member 324 cooperate to flip a mobile phone placed on the detection tray 212 over, e.g., from a face-up position to a face-down position. The ramp member 324 is pivotally mounted to a bracket (or other adjacent mounting structure, not shown) by a pivot 334. In the illustrated embodiment, the ramp member 324 includes a generally smooth, curved (e.g., radiused) concave surface 327 facing the pusher member 322. In some embodiments, a lower portion of the ramp member 324 may include a mirror 326 that enables the camera 325 to capture images of the adjacent side of a mobile phone (not shown) positioned on the detection tray 212. In the illustrated embodiment, the detection tray 212 is pivotally mounted (e.g., via bearings) to a bracket (or other adjacent support structure; not shown) by a pivot shaft 336 that is fixedly attached to the detection tray 212 and enables the detection tray 212 to pivot between the positions shown in FIGS. 2A-2C.
Turning next to FIG. 3B, a first pulley 346 is fixedly attached to the left end portion of the pivot shaft 336. The first pulley 346 is operatively connected to a second pulley 342 by a drive belt 344 (e.g., a toothed rubber drive belt). The second pulley 342 is in turn fixedly attached to the drive shaft of a motor 340 (e.g., a stepper motor), which is mounted to a lower portion of the back plate 356. Accordingly, the motor 340 can be operated (e.g., via a kiosk controller; not shown) to rotate the first pulley 346 via the second pulley 342 and the drive belt 344. Rotation of the first pulley 346 rotates the pivot shaft 336, which in turn rotates the test tray 212. In this manner, operation of the motor 340 may be used to rotate the detection tray 212 between the three positions shown in FIGS. 2A-2C.
In the illustrated embodiment, the pusher member 322 includes a short vertical surface 378 extending upwardly adjacent to the upper surface of the test tray 212, and an angled surface 380 extending upwardly from the vertical surface 378. The pusher member 322 extends forward from a base 360. The base 360 is slidably mounted on upper and lower guide shafts 328a and 328b, respectively. More specifically, in the illustrated embodiment, the base 360 includes two cylindrical bores 376a and 376b (FIG. 3C), and the guide shafts 328a, 328b may be cylindrical shafts slidably received in the respective bores 376a, 376b. The base 360 is movably coupled to a drive screw 330 by a threaded coupling 364 (e.g., a drive nut). The opposite end portion of the drive screw 330 is fixedly attached to a first pulley 350. As shown in FIG. 3B, the first pulley 350 is operatively coupled to a second pulley 348 via a drive belt 352 (e.g., a toothed rubber drive belt). The second pulley 348 is fixedly attached to the drive shaft of a motor 354 (e.g., a stepper motor) mounted to the rear surface of the back plate 356. In operation, a kiosk controller (not shown) may operate the motor 354 to rotate the second pulley 348, which in turn drives the first pulley 350 to rotate the drive screw 330. Rotation of the drive screw 330 in a first direction causes the pusher member 322 to move on the guide shafts 328a, 328b across the test tray 212 toward the ramp member 324. Conversely, rotation of the drive screw 330 in the opposite direction causes the pusher member 322 to move away from the ramp member 324 and return to its starting position on the opposite side of the test tray 212.
As shown in FIG. 3C, the base 360 of the pusher member 322 includes a contact surface 370. In operation, as the pusher member 322 approaches the ramp member 324, the contact surface 370 contacts a contact feature 372 (e.g., a cylindrical pin) extending rearward from a lower portion of the ramp member 324. As the pusher member 322 continues to move toward the ramp member 324 (i.e., from left to right in FIG. 3C), the contact surface 370 drives the contact feature 372 to the right, causing the ramp member 324 to rotate counterclockwise about the pivot 334 as viewed in FIG. 3C (which is equivalent to rotating clockwise about the pivot 334 as viewed in FIG. 3B). When the pusher member 322 moves away from the ramp member 324, a return spring (not shown) and/or another biasing member operatively coupled to the ramp member 324 causes the ramp member 324 to rotate back to its original position.
FIG. 4A is a right rear isometric view of the detection tray 212 configured in accordance with an embodiment of the present technology. In the illustrated embodiment, an electrical connector carrier assembly 478 is mounted to the underside of the test tray 212 and moves with the test tray 212 as it pivots between the three positions shown in FIGS. 2A-2C, described above. The connector carrier assembly 478 includes a motor 476 (e.g., a stepper motor) operably coupled to a cam shaft (not shown). The cam shaft includes a plurality of cam lobes, each positioned to operatively move a corresponding one of a plurality of mobile device electrical connectors 474a-c (e.g., USB connectors, Android and iOS connectors, etc.) located in or near an opening 475 in the shelf 218 of the test tray 212. In operation, a kiosk controller (not shown) may activate the motor 476, which in turn rotates the cam shaft so that one of the cam lobes selectively drives a desired electrical connector (e.g., 474b) outward through the opening 475 while the other electrical connectors (e.g., 474a and 474c) remain within the opening 475. When the desired electrical connector 474 is in this position, the user can easily connect their mobile phone to the correct connector when placing the phone on the shelf 218, as described above with reference to FIG. 2A. As shown in FIG. 4A, the cam shaft arrangement of the connector carrier assembly 478 provides compact placement of the electrical connectors 474a-c on the test tray assembly 470. In some embodiments, the correct electrical connector is selected based on the make and model of the phone that the user wishes to sell, as provided via the display 104 (FIG. 1). Once the mobile phone has been electrically tested via the selected connector, the motor 476 may rotate the cam shaft to drive the selected connector back through the opening, thereby disconnecting the connector from the mobile phone. This enables the mobile phone to be flipped as described in detail below.
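By way of a hedged illustration only, the connector-selection step described above could be coordinated in software along the following lines. The class and method names (ConnectorCarrier, step_to_slot, step_to_neutral), the slot indices, and the brand-to-connector table are assumptions made for this sketch and are not part of the disclosure above.

```python
# Illustrative sketch only: selects which of the connectors 474a-c to extend
# through the opening 475 based on the detected brand. The motor interface
# (step_to_slot/step_to_neutral) is hypothetical.

CONNECTOR_SLOTS = {"lightning": 0, "usb_c": 1, "micro_usb": 2}   # assumed slot order

BRAND_TO_CONNECTOR = {       # assumed mapping for illustration
    "apple": "lightning",
    "samsung": "usb_c",
    "google": "usb_c",
    "lg": "micro_usb",
}

class ConnectorCarrier:
    """Hypothetical software wrapper around the cam-shaft motor (e.g., motor 476)."""

    def __init__(self, motor):
        self.motor = motor
        self.extended = None

    def present(self, brand: str) -> str:
        connector = BRAND_TO_CONNECTOR.get(brand.lower(), "usb_c")
        self.retract()                                        # ensure nothing is extended
        self.motor.step_to_slot(CONNECTOR_SLOTS[connector])   # cam lobe pushes connector out
        self.extended = connector
        return connector

    def retract(self) -> None:
        if self.extended is not None:
            self.motor.step_to_neutral()   # all connectors back inside opening 475
            self.extended = None
```

Retracting before presenting mirrors the sequence described above, in which the selected connector is driven back through the opening before the phone is flipped.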
As described above with reference to FIG. 2A, in some embodiments the self-service terminal 100 includes a wireless charger 222 (e.g., a "Qi" wireless charger) mounted (e.g., via a plurality of screws or other fasteners) on the back side of the test tray 212. The wireless charger 222 is positioned such that the charging pad 480 of the wireless charger 222 is relatively close to (e.g., within about 1.6 inches of) a mobile phone placed on the front side (detection surface) of the detection tray 212, for efficient charging of the phone. In some embodiments, the wireless charger 222 may be a wireless charger provided by STMicroelectronics. In other embodiments, other wireless chargers may be used.
Fig. 4B is a schematic diagram illustrating a mounting arrangement of the wireless charger 222 configured in accordance with embodiments of the present technology. In the embodiment shown, the wireless charger 222 is electrically connected to the self-service terminal processor 402 (e.g., via a serial port) and receives power from the self-service terminal power supply 404. Kiosk processor 402 is sometimes referred to as "processor 402." In operation, when the wireless charger 222 is powered on, it wirelessly provides power to the mobile phone 210, and the phone responds by providing one-way communication to the wireless charger 222. The communication may include specific information about the phone, including, for example, the device manufacturer (e.g., Apple®, Samsung®, TI®, etc.), a unique identifier associated with the phone (e.g., a unique 32-bit identifier), and the like. As described herein, the kiosk processor 402 may use this information to direct and facilitate phone detection and/or purchase transactions, among other useful purposes.
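As a rough illustration of how the processor 402 might consume such a report, the sketch below assumes a simple, hypothetical payload layout (a 16-bit manufacturer code followed by a 32-bit device identifier); the real in-band signalling between a phone and a Qi charger is more involved, and the manufacturer-code table here is invented for the example.

```python
# Minimal sketch, assuming a hypothetical 6-byte report from the wireless charger:
# a 16-bit manufacturer code followed by a 32-bit unique device identifier.

import struct

MANUFACTURER_CODES = {0x004C: "Apple", 0x00E8: "Samsung"}   # assumed mapping

def parse_charger_report(payload: bytes) -> dict:
    if len(payload) < 6:
        raise ValueError("short charger report")
    maker_code, device_id = struct.unpack(">HI", payload[:6])
    return {
        "manufacturer": MANUFACTURER_CODES.get(maker_code, "unknown"),
        "device_id": f"{device_id:08X}",   # e.g., later used to track the phone
    }

# Example: an Apple device might trigger an iCloud sign-out prompt on the display 104.
info = parse_charger_report(bytes.fromhex("004C12345678"))
```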
Figs. 5A-5G are a series of front views of the detection zone 216 illustrating multiple stages of operation of the flipping mechanism 320, in accordance with embodiments of the present technology. In these figures, the front portion of the test tray 212 is omitted to better illustrate the operation of the flipper components. Beginning with FIG. 5A, the mobile phone 210 is positioned on the test tray 212 with its front side (e.g., display screen) facing upward, as indicated by arrow F. By way of example, this position corresponds to the mobile phone 210 being electrically and visually inspected as shown in FIG. 2B. For example, the mobile phone 210 may be electrically inspected and evaluated using a suitable one of the electrical connectors 474a-c (FIG. 4A), and the front side of the mobile phone 210 may be visually inspected and evaluated via the camera 325, as described above. Additionally, the sides (e.g., a first or left side 586a and a second or right side 586b) of the mobile phone 210 may be evaluated via the camera 325 using one or more mirrors. For example, the mirror 326 (FIG. 3A) located below the ramp member 324 may enable the camera 325 to acquire images of the left side 586a of the mobile phone 210, and a mirror 326 (FIG. 3B) located toward a lower portion of the pusher member 322 may enable the camera 325 to acquire images of the right side 586b of the mobile phone 210. Once the mobile phone 210 has been electrically evaluated, the electrical connector 474 is separated from the mobile phone 210, as described above with reference to FIG. 4A. Once the electrical connector 474 has been separated and the front surface 585a and/or the side surfaces 586a, 586b of the mobile phone 210 have been visually evaluated and/or imaged as desired, the mobile phone 210 may be flipped over so that the back side of the mobile phone 210 can be visually inspected via the camera 325.
Referring next to FIG. 5B, before the flipping process begins, the camera 325 verifies that a mobile phone (or other electronic device) is on the test tray 212 and that the phone is not too large to flip. Upon confirming this, the pusher member 322 is moved from right to left as indicated by arrow D1 to begin the flipping process. As the pusher member 322 moves in this direction, the vertical surface 378 contacts at least a portion of the right side 586b of the mobile phone 210 and pushes the left side 586a against a lower portion of the curved surface 327 of the ramp member 324. This causes the left side 586a to slide upward against the curved surface 327 and the right side 586b to slide right-to-left across the upper surface of the test tray 212. If at any point the mobile phone 210 becomes jammed (as indicated, for example, by an over-current draw on the motor 354; FIG. 3B), the pusher member 322 reverses direction, returns to the starting position, and the process repeats. If the mobile phone 210 cannot be flipped after a preset number of attempts, the user may be notified via the display 104 (FIG. 1) and the phone may be returned to the user.
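A minimal sketch of this jam-and-retry behavior is shown below; the kiosk interface (move_pusher, pusher_moving, motor_current, notify_user), the attempt limit, and the current threshold are all assumptions made for illustration.

```python
# Sketch of the flip retry loop: back the pusher off on an over-current (jam)
# condition and give up after a preset number of attempts.

MAX_FLIP_ATTEMPTS = 3        # "preset number of attempts" (assumed value)
JAM_CURRENT_AMPS = 1.5       # assumed over-current threshold for motor 354

def flip_device(kiosk) -> bool:
    for _ in range(MAX_FLIP_ATTEMPTS):
        jammed = False
        kiosk.move_pusher("toward_ramp")
        while kiosk.pusher_moving():
            if kiosk.motor_current() > JAM_CURRENT_AMPS:
                jammed = True
                break
        kiosk.move_pusher("home")            # return to the starting position
        if not jammed:
            return True                      # flip completed normally
    kiosk.notify_user("We could not flip your phone; please retrieve it.")
    return False
```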
Continued movement of the pusher member 322 from right to left, as indicated by arrow D2, causes the mobile phone 210 to move to a nearly vertical orientation, resting on the ramp member 324 with its right side 586b bearing against the upper surface of the test tray 212, as shown in FIG. 5C. As discussed above with reference to FIG. 3C, when the pusher member 322 moves to this position, it interacts with the ramp member 324 and causes the ramp member 324 to rotate clockwise about the pivot 334 through an arc R1, as shown in FIG. 5D. In some embodiments, the arc may be from about 5 degrees to about 25 degrees, or about 10 degrees. Rotating the ramp member 324 in this manner tips the mobile phone 210 past an over-center position so that it falls onto the angled surface 380 of the pusher member 322, as shown in FIG. 5E. In this position, the mobile phone 210 rests on the angled surface 380 with the front side of the phone (e.g., the display) facing the angled surface 380.
Turning next to FIG. 5F, as the pusher member moves from left to right as indicated by arrow D3, the ramp member 324 returns to its original position through an arc R2. As the pusher member 322 moves to the right, the left side 586a of the mobile phone 210 slides down the angled surface 380, so that when the pusher member 322 returns to its original position, the mobile phone 210 lies flat on the test tray 212 with its front side now facing downward, as indicated by arrow F in FIG. 5G. In this position, the mobile phone 210 may be visually inspected by the camera 325 to determine, for example, whether there is any damage to the back surface of the mobile phone. Such damage may include, for example, cracks, gouges, damage to a phone camera, and the like. Further, as described above with reference to FIG. 2B, on some mobile phones the IMEI number is printed or otherwise formed on the rear surface of the phone. In some embodiments, once the phone is flipped over, the self-service terminal 100 may visually inspect such phones (e.g., via OCR) to read or otherwise obtain the IMEI number from the back surface of the mobile phone. In some cases, the right side 586b of the mobile phone 210 may end up abutting against, or too close to, the mirror 326 at the lower portion of the ramp member 324, and as a result the camera 325 may be unable to obtain a satisfactory side profile image of the mobile phone 210. For such cases, the kiosk may include one or more electromechanical vibrators 590 that vibrate the detection tray 212 in response to control signals from the kiosk processor 402, thereby moving the mobile phone 210 away from the mirror 326 so that the camera 325 can obtain a suitable side image. Further, in some embodiments, the test tray 212 may be tilted slightly downward away from the ramp member 324 to facilitate the aforementioned movement of the mobile phone 210 in response to vibration of the test tray 212.
Figs. 6A-6C are a series of front views illustrating various stages of operation of a flipping mechanism 320a configured in accordance with another embodiment of the present technology. Many of the components and features of the flipping mechanism 320a are at least substantially similar in structure and function to those of the flipping mechanism 320 described in detail above. However, in one aspect of the present embodiment, the flipping mechanism 320a is configured to flip the mobile phone 210 without requiring any "tapping" movement of the ramp member 324. For example, in some embodiments, a portion of the upper surface of the detection tray 212 may curve or slope upward toward the base of the ramp member 324, as shown by ramp feature 690. As shown in FIG. 6A, when the pusher member 322 moves from right to left, the first side 586a of the mobile phone 210 first slides up the ramp feature 690 and then slides up the curved surface 327 of the ramp member 324. As the pusher member 322 continues to push the right side 586b of the mobile phone 210 to the left, the right side 586b moves up the ramp feature 690 and farther into the recess formed by the concave curved surface 327, as shown in FIG. 6B. The mobile phone 210 is then in an over-center position, which causes the mobile phone 210 to fall to the right onto the angled surface 380 of the pusher member 322 without any "tapping" or pushing by the ramp member 324. As shown in FIG. 6C, as the pusher member 322 moves to the right toward its starting position, the opposing sides 586a, 586b of the mobile phone 210 slide down the angled surface 380 and the ramp feature 690, respectively, so that the mobile phone 210 eventually rests face down on the upper surface of the test tray 212. Thus, the embodiment described above with reference to FIGS. 6A-6C provides a way of flipping the mobile phone 210 without requiring the ramp member 324 to rotate or move.
While various embodiments of a flipping mechanism are described herein, it is contemplated that other flipping mechanisms configured in accordance with the present technology may also be provided to flip mobile phones and other mobile devices. For example, referring to FIGS. 6A-6C, it is contemplated that the ramp member 324 may be configured with a concave curved surface similar to the surface 327, but with the upper portion extending farther to the right in FIG. 6A relative to the base of the ramp member 324. By configuring the upper portion of the curved surface 327 to extend farther in this direction, it is contemplated that mobile phones and other mobile devices may be moved to an over-center position using a pusher member at least substantially similar to the pusher member 322 described above, without requiring the ramp member 324 to move or apply any tapping action to the mobile phone 210.
In some embodiments, the flipping mechanism 320 described in detail above may be used to flip more than just mobile phones and other handheld electronic devices. For example, in some embodiments, the flipping mechanism 320 may be used to flip a user's identification card (e.g., a driver's license or other ID card). In these embodiments, when the test tray 212 is in the position shown in FIG. 2A, the user places their ID card face outward on the tray, and the test tray 212 then rotates back to the position shown in FIG. 2B. In this position, the detection area camera 325 (see, e.g., FIG. 5A) captures an image of the front side of the ID card, and the flipping mechanism 320 then flips the ID card in the manner described above with reference to FIGS. 5A-5G so that the camera 325 can capture an image of the back side of the ID card. The images of the ID card may then be stored in a database and associated with the mobile phone sold by the user. Further, identification information may be read from the ID card images (e.g., via OCR, etc.) and checked against a database of potentially fraudulent sellers as a fraud-prevention measure. The images may also be transmitted to a remote computer for display to a remote operator, who can compare the information on the ID card (e.g., the person's age, height, weight, etc.) with the user image obtained via the external camera 106 (FIG. 1) to verify the user's identity. In such embodiments, where the user's ID card or other form of identification is verified via the detection area camera 325 as described above, the ID scanner 108 (FIG. 1) may not be necessary and may be omitted.
In other embodiments, when the test tray 212 is in the position shown in FIG. 2A, the user may place their ID card on the tray, and the camera 325 may capture an image of the front side of the ID card while the tray is in that position (i.e., without rotating back to the position shown in FIG. 2B). After imaging the front side of the ID card, the kiosk may instruct the user to flip the ID card so that the back side of the card can be imaged, if desired. The identification information may then be read from the ID card images (e.g., via OCR, etc.) and checked against a database of potentially fraudulent sellers as a fraud-prevention measure, and/or stored in a database and associated with the mobile phone sold by the user, as described above. The images may also be transmitted to a remote computer for display to a remote operator (or, e.g., for evaluation by an artificial-intelligence system), which may compare the information on the ID card with an image of the user taken via the external camera 106 (FIG. 1) to verify the identity of the user, as described above.
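Purely as an illustration of this OCR-and-check step, the sketch below uses the pytesseract and OpenCV packages (their availability on the kiosk PC is an assumption) together with a hypothetical fraud_db interface; real ID parsing would use per-ID-type templates rather than the naive field extraction shown.

```python
# Sketch: OCR an ID-card frame and look the holder up in a deny-list.

import cv2                   # OpenCV, assumed available for frame preprocessing
import pytesseract           # OCR engine, assumed available on the kiosk PC

def check_id_card(frame_bgr, fraud_db) -> dict:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray)
    # Naive "name" extraction for illustration only: first non-empty OCR line.
    holder_name = next((ln.strip() for ln in text.splitlines() if ln.strip()), "")
    return {
        "name": holder_name,
        "flagged": fraud_db.contains(holder_name),   # hypothetical deny-list interface
        "raw_text": text,
    }
```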
Figs. 7A-7C are a series of cross-sectional side views of the self-service terminal 100 illustrating multiple stages of processing of a mobile device (e.g., the mobile phone 210) in accordance with embodiments of the present technology. As the reader will observe, the positions of the detection tray 212 in FIGS. 7A-7C correspond to the positions of the detection tray 212 in FIGS. 2A-2C, respectively, described above. Turning first to FIG. 7A, this view shows the mobile phone 210 positioned on the test tray 212 with the bottom edge of the mobile phone 210 abutting the shelf 118. In some embodiments, the wireless charger 222 may provide a quick charge to the mobile phone 210 if the mobile phone 210 is placed on the test tray 212 without power. In addition, even if the mobile phone 210 is fully or partially charged when placed on the test tray 212, the mobile phone 210 responds to receiving power from the wireless charger 222 by providing information about the phone to the wireless charger 222. As described above, in some embodiments this information may be sent to the kiosk processor 402 (FIG. 4B) and may include a wireless power ID packet that can be used to determine, for example, the brand of the device. The brand of the device may be used to determine appropriate questions, prompts, etc. to display to the user via the kiosk display 104 during the phone detection/purchase process. For example, if the mobile phone 210 is an Apple® product, the display screen 104 may provide the user with a message or instruction to log out of their iCloud account so that the phone can be purchased by the kiosk 100. Further, in some embodiments, the unique phone identification information received from the mobile phone 210 via the wireless charger 222 may be used to track downstream processing of the mobile phone, thereby alleviating the need for the user to place an identification label or sticker (e.g., a bar code sticker) on the phone for tracking purposes.
In some embodiments, the brand of the mobile phone 210 will be only a portion of the information needed to provide an accurate price quote for the mobile phone 210. Thus, if the brand is the only information available to the kiosk 100, the kiosk 100 may determine (e.g., by accessing an associated price database) a low-end price of the price range for that brand of phone and offer to purchase the mobile phone 210 at that low price. In addition, the kiosk 100 may also offer to perform a more thorough evaluation of the mobile phone 210, if the user is willing to wait, with the possibility of obtaining a higher price for the mobile phone 210. If the user chooses to sell the mobile phone 210 at the low price (e.g., by selecting an appropriate response to a prompt on the kiosk display 104), the kiosk 100 may retain the mobile phone 210 and pay, or cause payment to, the user as described herein. If the user decides to wait and authorizes the kiosk 100 to perform a more thorough evaluation, the kiosk 100 may obtain additional device information, such as model number, carrier, storage capacity, IMEI number, etc., by directing the user to provide additional information. For example, when the mobile phone 210 is in the position shown in FIG. 7A, the kiosk 100 may instruct the user (via the kiosk display 104) how to navigate through menus and/or other options/commands on the display of the mobile phone 210 to display information about the mobile phone 210. The kiosk 100 may adjust the messages or instructions presented to the user based on the brand of the mobile phone 210 obtained via the wireless charger 222. As an example, the user may interact with the mobile phone touchscreen display to bring up an "About" page using the phone's "Settings" menu. The "About" page may display various types of information about the phone, which the kiosk 100 may capture via the camera 325 and process using, for example, associated OCR software. Such information may include, for example, model number, serial number, operating system/operating system version, IMEI number, IP address, MAC address, carrier, memory configuration, user information, cloud lock status, and the like. This information may be used by the kiosk 100 to determine (e.g., by accessing an associated pricing database) a more accurate price or price range (e.g., high and low prices) for the mobile phone 210 and present the price or price range to the user.
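One hedged way to turn the OCR output from such an "About" page into structured fields is sketched below; the regular expressions assume typical label/value formatting and are illustrative rather than tuned to any particular phone OS.

```python
# Sketch: extract a few fields (model, serial, IMEI) from OCR text of an "About" page.

import re

FIELD_PATTERNS = {
    "model":  re.compile(r"model(?:\s*(?:number|name))?\s*[:\-]?\s*(\S.*)", re.I),
    "serial": re.compile(r"serial\s*(?:number)?\s*[:\-]?\s*(\S+)", re.I),
    "imei":   re.compile(r"\bIMEI\b[^0-9]*(\d{14,16})", re.I),
}

def parse_about_screen(ocr_text: str) -> dict:
    info = {}
    for line in ocr_text.splitlines():
        for field, pattern in FIELD_PATTERNS.items():
            match = pattern.search(line)
            if match and field not in info:
                info[field] = match.group(1).strip()
    return info

# e.g. parse_about_screen("Model Number: A2403\nIMEI 356789104321987")
#      -> {"model": "A2403", "imei": "356789104321987"}
```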
If the self-service terminal 100 is unable to obtain the necessary information by guiding the user, for example because the user is unable to follow the messages or instructions, or because the user does not want to spend time following them, the self-service terminal 100 may present an electrical connector (e.g., one of the electrical connectors 474a-c described above with reference to FIG. 4A) for connection to the mobile phone 210 as described above. The kiosk processor 402 may determine the appropriate electrical connector (e.g., a USB-C or Lightning connector) to present to the user on the test tray shelf 218 based on, for example, the brand of the mobile phone 210. After the user connects the electrical connector to the mobile phone 210, the kiosk 100 may retrieve the required information via electrical inspection as described above. The kiosk 100 may then determine a more accurate price or price range for the mobile phone 210 based on this information and present the price or price range to the user. If the user is not interested in the proposed price or price range and no longer wishes to proceed with the transaction, the user may simply retrieve their phone from the self-service terminal 100. If the user decides to proceed with the transaction, the user may be asked to attach a label (e.g., a bar code) carrying a unique code dispensed from the outlet 116 (FIG. 1) to the back of the mobile phone 210 for tracking purposes and then place the phone back on the test tray 212.
At this point, in some embodiments, the user may choose to sell the mobile phone 210 at the lower end of the more accurate price range, or the user may choose to have the phone inspected further, e.g., to check whether the phone display is cracked, to potentially obtain a higher price for the phone if the screen is not cracked. If the user chooses to sell the mobile phone 210 at the lower price, the kiosk 100 may retain the mobile phone 210 and pay the user the lower price as described below. Alternatively, if the user authorizes the kiosk 100 to further inspect the mobile phone 210, e.g., to check for damage to the phone display, and potentially offer a higher price, the access door 112 is closed and the detection tray 212 is rotated to the horizontal, gently sloping position shown in FIG. 7B. In this position, the mobile phone 210 may be visually inspected to determine whether the device is cracked, as described above. For example, as described above, in some embodiments this includes a visual inspection of the front side of the mobile phone 210, followed by a visual inspection of the back side after flipping the phone using the flipping mechanism 320 described in detail above. As described above, the lights 366a, 366b may facilitate this visual inspection in those embodiments in which the test tray 212 is configured as a light table. If the display or other portion of the mobile phone 210 is cracked, the price offered for the phone will be lower than if the device were undamaged.
In some embodiments, the kiosk 100 may inspect the condition of the mobile phone 210 while the test tray 212 is in the vertical position shown in FIG. 7A. For example, the camera 325 may image the mobile phone 210 before or after an OCR system (e.g., OCR system 807 in FIG. 8) obtains information about the mobile phone 210, and the kiosk processor 402 may process the images as described above to detect cracks or other damage in the front side of the device (e.g., in the display screen) or, after the user turns the mobile phone 210 over, on the back side of the phone. Further, in some embodiments, the camera 325 may also facilitate identifying the mobile phone 210 by acquiring device images at this time, which the processor 402 may process to determine dimensions, e.g., width, height, curvature of edges/corners, etc., and/or markings on the mobile phone 210 (e.g., markings identifying manufacturer, model, etc.). The determined dimensions, indicia, etc. may then be compared to known phone dimensions, indicia, etc. stored in a local or remote database accessed by the processor 402 to identify the mobile phone 210. In some embodiments, the kiosk lighting system in the detection area 216 may be controlled (e.g., by turning one or more lights on or off, moving a light, etc.) to facilitate the above-described visual analysis of the mobile phone 210. Performing a visual analysis of the mobile phone 210 in the manner described above while the test tray 212 is in the position shown in FIG. 7A may enable the kiosk 100 to determine and present to the user a final price for the phone, rather than, for example, a price range, even before the test tray 212 is rotated to the horizontal, gently sloping position shown in FIG. 7B. Thus, it should be understood that the kiosk 100 may perform many different operations for evaluating, inspecting, identifying, etc. a mobile device located on the test tray 212 while the test tray 212 is in the position shown in FIG. 7A.
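The description above does not spell out a particular damage-detection algorithm; purely to illustrate where such image processing could plug in, the sketch below uses a very simple stand-in heuristic (edge density within the screen region, via OpenCV). The threshold is an assumption, and a production system would use something far more robust than this.

```python
# Illustrative stand-in heuristic: a cracked screen tends to produce many more
# edges than an intact one. Not the method disclosed above.

import cv2
import numpy as np

def looks_cracked(screen_roi_bgr, edge_density_threshold: float = 0.08) -> bool:
    gray = cv2.cvtColor(screen_roi_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # binary edge map
    density = float(np.count_nonzero(edges)) / edges.size
    return density > edge_density_threshold
```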
As described above with reference to FIG. 2A, in some embodiments the detection tray 212 may include one or more mirrors 213 located on or within its upper surface, with the mirror 213 positioned, for example, adjacent to where the mobile phone 210 is placed (FIG. 2A) so that the mobile phone 210 does not cover or block the mirror 213. Additionally or alternatively, in other embodiments, the entire upper surface of the detection tray may be a mirror. In these embodiments, the user may be instructed, via a suitable prompt on the kiosk display screen 104, to position themselves in front of the kiosk 100 so that the user's face is reflected in the mirror 213. The kiosk camera 325 may then capture an image of the user's face reflected in the mirror 213, and this image may be used to confirm the user's identity for the security and/or legal reasons described above. In other embodiments, one or more mirrors may be located elsewhere near the detection region 216 and positioned as described above to place the reflection of the user's face in the field of view of the camera 325. Using one or more mirrors 213 and the camera 325 to acquire images of the user may eliminate the need for, and the cost of, the external camera 106.
After obtaining the device information and performing the visual inspection, the kiosk 100 may determine a price for the mobile phone 210. For example, to determine the price, the kiosk 100 may use information about the make and model of the phone, or one or more unique identifiers of the phone, to look up the current price of the device in a database or pricing model. The database or pricing model may be, for example, a local lookup table of common devices and/or a remotely hosted database or web service to which the kiosk 100 may send information about the electronic device and receive a current market value or offer for the electronic device. After the purchase price has been determined, the offer may be presented to the user via the display screen 104.
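A minimal sketch of such a lookup, assuming a local table of common devices with a fall-back to a remotely hosted pricing service, is shown below; the URL, table contents, and response field name are placeholders, not real endpoints or prices.

```python
# Sketch: local price table first, remote pricing web service as a fallback.

import json
import urllib.request

LOCAL_PRICE_TABLE = {                                     # illustrative values only
    ("Apple", "iPhone 12", "64GB"): 180.00,
    ("Samsung", "Galaxy S21", "128GB"): 150.00,
}

def quote_price(make, model, storage,
                remote_url="https://pricing.example.com/quote"):   # placeholder URL
    key = (make, model, storage)
    if key in LOCAL_PRICE_TABLE:
        return LOCAL_PRICE_TABLE[key]
    body = json.dumps({"make": make, "model": model, "storage": storage}).encode()
    request = urllib.request.Request(remote_url, data=body,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request, timeout=5) as response:
        return json.load(response)["offer_price"]         # assumed response field
```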
If the user accepts the offer, the kiosk 100 may verify the user's identity and/or perform other fraud-prevention measures described above with reference to FIG. 2B. Once these steps are satisfactorily completed, the test tray 212 rotates further rearward, as shown in FIG. 7C, causing the mobile phone 210 to slide off the rear of the test tray 212 and into a chute 792 leading to a storage bin 794. It should be noted that the front of the test tray 212 includes a skirt 796 that prevents a user from reaching under the test tray 212 and into the storage bin 794 when the access door 112 is open and the test tray 212 is in the position shown in FIG. 7B. Once the phone is received in the storage bin 794, the self-service terminal 100 may provide payment of the purchase price to the user, or the self-service terminal 100 may retain the phone and cause payment to be made to the user as described herein. In some embodiments, the payment may be made in the form of cash dispensed from the payment outlet 110. In other embodiments, the user may receive payment for their mobile phone 210 in a variety of other ways. For example, the user may be paid via a redeemable cash voucher, a coupon (e.g., a coupon toward the purchase of another mobile phone), an electronic certificate, a gift code, a prepaid card, etc. dispensed from the kiosk 100; alternatively, the kiosk 100 may effect payment by sending a gift code, redeemable voucher, coupon, electronic certificate, etc. to the user via email, text message, or other form of electronic message. Further, in some embodiments, the kiosk 100 may make payment to the user via a wired or wireless money transfer to an electronic account (e.g., a bank account, a credit account, a points account, an online commerce account, a mobile wallet such as PayPal, Venmo, etc.).
In other embodiments, the price of the mobile phone 210 presented to the user may be an offer or a range of offers. For example, in some embodiments, the kiosk 100 may provide the user with a range of offers for the mobile phone 210, with the final price paid for the phone determined by a subsequent evaluation of the mobile phone 210 by a human operator at a remote facility. The higher quoted price may be paid if the manual inspection confirms that the mobile phone 210 is in the condition determined by the self-service terminal 100, while the lower quoted price may be paid if the manual inspection finds that the mobile phone 210 is in a worse (e.g., more damaged) condition than initially determined by the self-service terminal 100. In such embodiments, if the user wishes to proceed with the sale transaction based on the quoted price (or price range), the self-service terminal 100 receives the mobile phone 210, but the self-service terminal 100 does not immediately pay the phone price to the user. Rather, after the kiosk operator has retrieved the mobile phone 210 from the kiosk 100 and the phone has undergone manual inspection to confirm its condition, the user may be paid a final price based on that condition (e.g., the higher or lower price) by, for example, mailing a check or by any number of different electronic payment methods including, for example, sending an electronic certificate, gift code, coupon, redeemable voucher, etc. via email, text message, or other form of electronic message, or making a wired or wireless money transfer to an electronic account (e.g., bank account, credit account, points account, online commerce account, mobile wallet, etc.).
Although only one storage bin (i.e., storage bin 794) is shown in FIGS. 7A-7C, in some embodiments the self-service terminal 100 may include two or more storage bins for storing different types of mobile phones and/or for storing phones that may require different types of post-receipt processing. For example, in some embodiments, the storage bin 794 may be a first storage bin for storing mobile phones that will be collected from the self-service terminal and subjected to normal resale processing, and the self-service terminal 100 may include a second storage bin (not shown) that receives mobile phones that may require some type of special handling or evaluation. Placing the second type of phone in the second storage bin enables the operator to quickly access such phones if an evaluation, report, etc. is required. As an example, to implement the second storage bin, the chute 792 may include two exit paths and a baffle (not shown) or similar device to direct each mobile phone into the appropriate storage bin.
As will be understood by those of ordinary skill in the art, the foregoing processes are but a few examples of the ways in which the kiosk 100 may be used to purchase, recycle, or otherwise process consumer electronic devices such as mobile phones. Furthermore, it should be understood that the configuration of the kiosk 100 described above is merely one example of a suitable mobile device evaluation, purchase, and/or recycling system that may be used with embodiments of the present technology. Accordingly, other embodiments of the present technology may use other systems without departing from the present technology. Although the foregoing examples are described in the context of a mobile phone, it should be understood that the kiosk 100 and its various embodiments may also be used in a similar manner to recycle almost any consumer electronic device, such as MP3 players, tablets, laptops, e-readers, PDAs, Google® Glass™, smart watches, and other portable or wearable devices, as well as other relatively less portable electronic devices, such as desktop computers, printers, televisions, DVRs, and devices used to play games, entertainment, or other digital media on CDs, DVDs, Blu-rays, and the like. Further, while the foregoing examples are described in the context of use by consumers, the kiosk 100 in its various embodiments may similarly be used by others (e.g., store clerks) to assist consumers in recycling, selling, exchanging, etc. their electronic devices.
FIG. 8 provides a schematic diagram representative of the architecture of the kiosk 100 in accordance with an embodiment of the present technology. In the illustrated embodiment, the self-service terminal 100 includes a suitable processor or central processing unit (CPU) 402 that controls the operation of the self-service terminal 100 as described above in accordance with computer-readable instructions stored on system memory 806. The processor 402 may be any logic processing unit, such as one or more CPUs, digital signal processors (DSPs), application-specific integrated circuits (ASICs), or the like. The processor 402 may be a single processing unit or multiple processing units in an electronic device or distributed across multiple devices. The processor 402 is connected to the memory 806 and may be internally coupled to other hardware devices and high-speed peripheral devices, for example by using a bus (e.g., a USB 3.0 hub 824, PCI-Express, or serial ATA bus, etc.). As an example, the processor 402 may comprise a standard personal computer (PC) (e.g., a Dell® Optiplex 7010 PC) or an embedded computer running any suitable operating system, for example, Windows® (e.g., the Windows 8 Pro operating system), Linux™, Android™, iOS, or another embedded or real-time operating system. In some embodiments, the processor 402 may be a small-form-factor PC with an integrated hard disk drive (HDD) or solid state drive (SSD) and universal serial bus (USB) or other ports to communicate with other components of the kiosk 100. In other embodiments, the processor 402 may comprise a microprocessor with a separate motherboard that interfaces with a separate HDD. The memory 806 may include read-only memory (ROM) and random-access memory (RAM) or other storage devices, such as disk drives or SSDs, that store executable applications, test software, databases, and/or other software necessary to, for example, control kiosk components, process electronic device information and data (e.g., to assess device manufacturer, model, condition, pricing, etc.), and communicate and exchange data and information with remote computers and other devices. Program modules may be stored in the system memory 806, such as an operating system, one or more application programs, other programs or modules, and program data. The memory 806 may also include a web browser for allowing the kiosk 100 to access and exchange data with web sites over the internet.
The processor 402 may also control the operation of electronic, optical, and electromechanical systems included with the kiosk 100 for electrical, visual, and/or physical analysis of electronic devices placed in the kiosk 100 for purchase or recycling. Such a system may include: one or more internal cameras (e.g., camera 325) for visual inspection of the electronic device, e.g., for determining external dimensions and/or determining conditions, e.g., whether and to what extent the LCD display of the mobile phone is cracked; and electrical connectors 474a-c (e.g., USB connectors), for example, for powering on and performing electrical testing of mobile phones and other electronic devices. Processor 402 is also operatively connected to connector carrier assembly 478 to control the dispensing of electrical connectors 474a-c, and to motors 340 and 354 to control the movement of test tray 212 and pusher member 322, respectively, as described above. The kiosk 100 also includes a plurality of mechanical components 822, the plurality of mechanical components 822 being electrically actuated to perform various functions of the kiosk 100 during operation. The mechanical component 822 may include, for example, the detection zone access door 112 (fig. 1). The kiosk 100 also includes a power source 404, which may include a battery power source and/or a device power source, for operation of various electrical components associated with kiosk operation.
In the illustrated embodiment, the kiosk 100 also includes a network connection 812 (e.g., a wired connection such as an Ethernet port, cable modem, FireWire cable, Lightning connector, USB port, etc.) adapted to communicate with, for example, various processing devices (including remote processing devices) via a communication link 808, and a wireless transceiver 813 (e.g., including a Wi-Fi access point; a Bluetooth transceiver; a near-field communication (NFC) device; a wireless modem or cellular radio using GSM, CDMA, 3G, 4G, and/or 5G technologies; etc.) adapted to communicate with, for example, various processing devices (including remote processing devices) via the communication link 808 and/or directly via, for example, a wireless peer-to-peer connection. For example, the wireless transceiver 813 may facilitate wireless communication with an electronic device, such as the electronic device 810 (e.g., the mobile phone 210), to wirelessly evaluate the electronic device, for example via a mobile application loaded on the device. Such communication with the electronic device 810 may occur when the device is near the kiosk 100 (e.g., in or near the detection area 216) or when the device is remote from the kiosk. In other embodiments, the kiosk 100 may include other components and features that may differ from those described above, and/or one or more of the components and features described above may be omitted.
In the illustrated embodiment, the electronic device 810 is depicted as a handheld device, such as the mobile phone 210. However, in other embodiments, the electronic device 810 may be another type of electronic device, including, for example, other handheld devices; PDAs; MP3 players; tablet, notebook, and laptop computers; e-readers; cameras; desktop computers; TVs; DVRs; game consoles; Google Glass™; smart watches; and the like. By way of example only, in the illustrated embodiment, the electronic device 810 may include one or more features, applications, and/or other elements commonly found in smartphones and other known mobile devices. For example, the electronic device 810 may include a CPU and/or a graphics processing unit (GPU) 834 for executing computer-readable instructions stored in memory. In addition, the electronic device 810 may include an internal power supply or battery 832, a dock connector 846, a USB port 848, a camera 840, and/or well-known input devices including, for example, a touch screen 842, keys, and the like. In many embodiments, the electronic device 810 can also include a speaker 844 for two-way communication and audio playback. In addition to the foregoing features, the electronic device 810 may include an operating system (OS) 831 and/or a device wireless transceiver, which may include one or more antennas 838, for wireless communication with, for example, other electronic devices, websites, and the kiosk 100. Such communication may be performed via, for example, the communication link 808 (which may include the internet, a public or private intranet, a local or extended Wi-Fi network, cell towers, a plain old telephone system (POTS), etc.), direct wireless communication, and so on.
FIG. 9 is a schematic diagram of a suitable network environment for implementing aspects of an electronic device recycling system 900 configured in accordance with an embodiment of the present technology. In the illustrated embodiment, a plurality of self-service terminals 100 (labeled as self-service terminals 100a-100n, respectively) may exchange information with one or more remote computers (e.g., one or more server computers 904) via a communication link 808. Although communication link 808 may include a publicly available network (e.g., the internet with a network interface), a private communication link (e.g., an intranet or other network) may also be used. Further, in various embodiments, each kiosk 100 may be connected to a host computer (not shown) that facilitates the exchange of information between the kiosk 100 and remote computers, other kiosks, mobile devices, and the like.
The server computer 904 may perform many or all of the functions for receiving, routing, and storing electronic messages, such as web pages, audio signals, and electronic images, necessary to carry out the various electronic transactions described herein. For example, the server computer 904 may retrieve and exchange web pages and other content with one or more associated databases 906. In some embodiments, the database 906 may include information associated with mobile phones and/or other consumer electronic devices. Such information may include, for example, make, model, serial number, IMEI number, carrier plan information, pricing information, owner information, and the like. In various embodiments, the server computer 904 may also include a server engine 908, a web page management component 910, a content management component 912, and a database management component 914. The server engine 908 may perform the basic processing and operating-system-level tasks associated with the various technologies described herein. The web page management component 910 may handle the creation and/or display and/or routing of web pages or other display pages. The content management component 912 may handle many of the functions associated with the routines described herein. The database management component 914 may perform various storage, retrieval, and query tasks associated with the database 906, and may store various information and data such as animations, graphics, visual and audio signals, and the like.
In the illustrated embodiment, the kiosks 100 may also be operatively connected to a number of other remote devices and systems through the communication link 808. For example, the kiosks 100 may be operatively connected to a plurality of user devices 918 (e.g., personal computers, laptop computers, handheld devices, etc.) having associated browsers 920. Similarly, as described above, each self-service terminal 100 includes wireless communication facilities for exchanging digital information with wirelessly enabled electronic devices, such as the electronic device 810 (e.g., the mobile phone 210). The kiosks 100 and/or the server computer 904 may also be operatively connected to a series of remote computers to acquire data and/or exchange information with necessary service providers, financial institutions, device manufacturers, authorities, government agencies, and the like. For example, the kiosks 100 and the server computer 904 may be operably connected to one or more cellular carriers 922, one or more device manufacturers 924 (e.g., mobile phone manufacturers), one or more electronic payment or financial institutions 928, one or more databases (e.g., GSMA IMEI databases, etc.), and one or more computers and/or other remotely located or shared resources associated with cloud computing 926. The financial institutions 928 may include all entities associated with conducting financial transactions, including banks, credit/debit card facilities, online commerce facilities, online payment systems, virtual cash systems, money transfer systems, and the like.
In addition to the above, the kiosks 100 and the server computer 904 may also be operatively connected to a reseller market 930 and a kiosk operator 932. The reseller market 930 represents a system of remote computers and/or service providers associated with reselling consumer electronic devices through electronic and physical channels. For example, such entities and facilities may be associated with online auctions for reselling used electronic devices and for determining market prices for such devices. The kiosk operator 932 may be a central computer or computer system for controlling the operation of all of the networked kiosks 100. Such operations may include, for example, remote monitoring and facilitation of kiosk maintenance (e.g., remotely testing kiosk functions, downloading operating software and updates, etc.), service (e.g., periodically replenishing cash and other consumables), performance, and so on. Additionally, the kiosk operator 932 may also include one or more display screens operatively connected to receive images from cameras (e.g., one or more of the cameras 106 and 325) located at each kiosk 100. This remote viewing capability enables an operator to verify user identification and/or make other real-time visual observations at the self-service terminals 100 during transactions. This may include a remote operator evaluating images of an electronic device to grade the physical condition of the device.
The above description of the electronic device recycling system 900 illustrates one possible network system suitable for implementing various techniques described herein. Accordingly, those of ordinary skill in the art will appreciate that other systems consistent with the present techniques may omit one or more of the facilities described with reference to fig. 9, or may include one or more additional facilities not described in detail in fig. 9.
Although specific circuitry is described above, those of ordinary skill in the art will recognize that a microprocessor-based system could also be used, in which any logic decisions are implemented in software. The foregoing discussion of FIGS. 8 and 9 provides a brief, general description of a suitable computing environment in which the present technology may be implemented. Although not required, aspects of the technology are described in the general context of computer-executable instructions, such as routines executed by a general-purpose data processing device (e.g., a server computer, a wireless device, or a personal computer). Those skilled in the art will appreciate that aspects of the technology may be practiced with other communication systems, data processing systems, or computer system configurations, including: internet appliances, handheld devices (including personal digital assistants (PDAs)), wearable computers, various cellular or mobile phones (including Voice over Internet Protocol (VoIP) phones), dumb terminals, media players, gaming devices, multi-processor systems, microprocessor-based or programmable consumer electronics, set-top boxes, network PCs, minicomputers, mainframe computers, and the like. Indeed, the terms "computer," "server," "host system," and the like are generally used interchangeably herein and refer to any of the devices and systems described above as well as any data processor.
FIG. 10 is a front view of the kiosk 100 after a user places an electronic device 1000 (e.g., a mobile phone) on the test tray 212 ("tray 212") of the kiosk 100. As described above with reference to FIG. 1, the kiosk 100 may include the kiosk display screen 104, which may display a perspective view 1010 of the device 1000 on the tray 212 as seen from the viewpoint of the camera 325 (FIGS. 3A, 7A) included in the kiosk detection area 216. The perspective view 1010 may be part of an image or video recording of the device 1000. As shown in FIG. 10, the perspective view 1010 of the device 1000 exhibits perspective distortion (which may also be referred to as a "keystone effect") that presents the device 1000 as a trapezoid, in which a first side 1020 of the device 1000 closest to the camera 325 appears longer than a second side 1030 of the device 1000 farther from the camera 325.
When the user places the device 1000 on the tray 212, the tray is in a tilted position, as shown in FIG. 7A. In fig. 7A, it can be seen that camera 325 is viewing device 1000 at an angle, with the result that the device appears distorted as shown in perspective view 1010.
FIG. 11 is similar to FIG. 10, but shows keystone correction applied to the image of the device 1000 displayed on the kiosk display screen 104. A hardware or software processor associated with the kiosk 100, such as the processor 402 in FIG. 4B, may receive the perspective view 1010 of the device 1000 as shown in FIG. 10. The processor 402 may correct for the perspective distortion, compensating for the foreshortening of features farther from the camera 325, such as the second side 1030 of the device 1000. As a result, the kiosk display 104 may display an augmented reality display that includes a corrected image 1100 in which the device 1000 appears rectangular, or at least approximately rectangular, rather than trapezoidal. The user may not even realize that the corrected image 1100 does not represent the actual image seen by the camera 325.
In some embodiments, the camera 325 in FIG. 7B has a direct view (which may also be referred to as a vertical or "non-angled" view) of the device 1000 when the tray 212 is rotated to the horizontal, gently sloping position shown in FIG. 7B. Thus, there is no (or at least relatively little) perspective distortion in the camera 325 view of the device 1000. Accordingly, the processor 402 associated with the camera 325 may forego performing keystone correction before displaying a visual representation of the device 1000 on the kiosk display screen 104 or, in some embodiments, on a display of a remote computing device used by a remote operator to view the device 1000. In some embodiments, the processor 402 may decide not to perform keystone correction based on the tray 212 being in the horizontal position and/or based on the access door 112 being closed. In other embodiments, the processor 402 may analyze the recorded image of the device 1000 and, upon determining that there is no (or relatively little) perspective distortion in the recorded image, e.g., no trapezoidal shape in the image, the processor may forego performing keystone correction.
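For illustration, keystone correction of this kind can be expressed as a single perspective warp once the four device corners have been located in the camera frame; the OpenCV-based sketch below assumes corner detection has already been done and that the output size is chosen to roughly match the device's aspect ratio.

```python
# Sketch: warp the trapezoidal device region into an upright rectangle.

import cv2
import numpy as np

def keystone_correct(frame_bgr, device_corners, out_w=480, out_h=960):
    """device_corners: four (x, y) pixel points ordered TL, TR, BR, BL."""
    src = np.float32(device_corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame_bgr, matrix, (out_w, out_h))
```

When the tray is horizontal (FIG. 7B), the warp can simply be skipped, matching the behavior described above.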
FIG. 12 is a front view of the kiosk 100 similar to FIG. 11, showing a visual representation on the kiosk display screen 104. The visual representation may be one or more images and/or videos of the device 1000. For example, in the illustrated embodiment, the visual representation includes user interaction with the device 1000. More specifically, in the present embodiment, the user has inserted their hand into a portion of the detection region 216 to touch the display of the device 1000, and the interaction is displayed on the kiosk display screen 104. Kiosk display screen 104 may display the user interacting with device 1000 in corrected image 1200.
FIG. 13A shows an augmented reality display on kiosk display 104. While the user is interacting with device 1000, a processor associated with kiosk 100 may create an augmented reality display on kiosk display screen 104 by superimposing one or more prompts 1300 or user messages on real-time video 1310 of device 1000. For example, as shown in FIG. 13A, the user's finger may be visible on the kiosk display screen 104.
The prompt 1300 may instruct the user how to navigate through the device 1000 so that the display of the device 1000 may display an image 1320 showing information about the device, such as:
Brand
Model number
Serial number
Operating system (OS) and OS version
IMEI (primary and secondary)
IP address (IPv4/IPv6)
BT/Wi-Fi MAC address
Operator (AT&T, Verizon, etc.)
Memory configuration
User information: e-mail, telephone, name
Cloud lock status (via the new-user login screen)
The messages or instructions may vary based on the make and/or model of the device 1000. To determine the make and/or model of the device 1000, the camera 325 may record images/video of the device 1000. Based on the recorded images/video, the processor may determine the dimensions (e.g., width, height, and/or thickness) of the device 1000, as well as other visual attributes, such as the curvature of the corners of the device 1000. For example, Apple® iPhones® have a particular corner curvature, and the corners of both Android and Microsoft phones tend to be sharper than those of iPhones. Based on the determined device characteristics (e.g., width, height, and/or curvature), the processor may determine the make and/or model of the device 1000. In other embodiments, the information acquired by the wireless charger (e.g., wireless charger 222; FIG. 2A) may be used by the self-service terminal 100 to determine the make and/or model of the device 1000. Thus, based on the make and/or model of the device, the processor may provide device-specific messages or instructions that can cause the device 1000 to display additional information about the device.
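A hedged sketch of matching measured geometry against a small reference table is shown below; the reference dimensions, tolerance, and table size are illustrative placeholders rather than official device specifications.

```python
# Sketch: nearest-neighbor match of measured width/height/corner radius
# against a (tiny, illustrative) reference table.

REFERENCE_DEVICES = [
    {"make": "Apple",   "model": "iPhone 12",  "w_mm": 71.5, "h_mm": 146.7, "corner_r_mm": 7.0},
    {"make": "Samsung", "model": "Galaxy S21", "w_mm": 71.2, "h_mm": 151.7, "corner_r_mm": 3.5},
]

def best_match(w_mm, h_mm, corner_r_mm, max_total_error_mm=6.0):
    best, best_err = None, float("inf")
    for ref in REFERENCE_DEVICES:
        err = (abs(ref["w_mm"] - w_mm)
               + abs(ref["h_mm"] - h_mm)
               + abs(ref["corner_r_mm"] - corner_r_mm))
        if err < best_err:
            best, best_err = ref, err
    return best if best_err <= max_total_error_mm else None   # None -> fall back to charger info
```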
For example, the prompt 1300 may instruct the user to go to a "Settings" menu on the device 1000 and select the menu item "About phone," which may provide all or part of the information listed above. The camera 325 may record an image of the display screen 1302 of the device 1000, and the processor may perform OCR to obtain the necessary information about the device 1000. In another example, the prompt 1300 may instruct the user to dial "*#06#" on the device 1000, which may bring up information about the device 1000, such as the IMEI number.
The image 1320 displayed on the device 1000 may vary in color, such as black on white or white on black. The font sizes and typefaces in the image 1320 may also vary. Accordingly, the kiosk 100 may include one or more light sources 1330 that provide suitable illumination for the device 1000. The light source 1330 may change the illumination based on the background color of the image 1320 and the color of the fonts in the image. For example, the illumination may be used to increase the visibility of the fonts in the image 1320: when the fonts are black, the illumination may be white, and when the fonts are white, the illumination may be colored and/or of low intensity. In addition, the processor may adjust the exposure, focus, etc. of the camera 325 to provide a clear image 1320 with suitable color contrast so that the fonts can be read from the image 1320.
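By way of illustration only, the lighting/exposure decision might look like the sketch below, which keys off the mean brightness of the captured screen region; the light-source and camera interfaces (set_light, set_exposure) are hypothetical.

```python
# Sketch: choose lighting and nudge exposure based on the screen's mean brightness,
# so that OCR sees readable contrast on both dark- and light-themed screens.

import cv2
import numpy as np

def adjust_for_screen(frame_bgr, light, camera):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    mean_brightness = float(np.mean(gray))
    if mean_brightness > 128:                 # light background, dark fonts
        light.set_light(color="white", intensity="high")
    else:                                     # dark background, light fonts
        light.set_light(color="white", intensity="low")
    camera.set_exposure(target_brightness=128, current=mean_brightness)
```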
The prompt 1300 may also direct the user to the user's personal account, such as their iCloud account, so that the kiosk processor 402 may confirm that the user has logged out of the personal account. If the user is still logged into the personal account, the prompt 1300 may instruct the user to log out. In some embodiments, this may be necessary because the user may retain rights to the device 1000, even if the device is stored within the kiosk 100, while the user is still logged into a personal account such as their iCloud account. Thus, before accepting the device 1000, the processor 402 may use OCR to determine whether the user has logged out of one or more personal accounts.
Further, after the user decides to sell the device 1000, the kiosk 100 may instruct the user, for example via the prompt 1300, to perform a factory reset. An OCR system (e.g., OCR system 807 in FIG. 8) may observe that the user initiated the factory reset. When the factory reset starts, the OS of the device 1000 displays a unique progress screen indicating that the reset has begun, and the OCR system may detect this unique progress screen to confirm the initiation of the factory reset. Restoring the device to factory settings is a lengthy process, and thus the kiosk 100 may treat it as the last interaction with the device 1000 before the device 1000 is sold by the user. In one embodiment, the user may initiate the factory reset, receive payment for the device 1000, and then leave the device 1000 in the self-service terminal 100 to complete the factory reset. Once the processor 402 performs OCR to obtain the necessary information about the device 1000, such as the make, model, memory capacity, and/or carrier of the device, the processor may provide one or more prices for the device. The first price may be lower than the second price and may represent the price of the device 1000 if the device is damaged/defective, while the second price may represent the price of the device 1000 if the device is not damaged/defective. Further, if desired, the prompt 1300 may indicate that an additional amount of time, such as a few minutes, is required to determine whether the device 1000 is damaged or defective, or whether the device 1000 is worth the second, higher price. The user may determine whether the additional amount of time is worth waiting, taking into account the difference between the two prices.
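Detecting the reset progress screen from OCR output could be as simple as the keyword check sketched below; the keyword list is an assumption and would need entries for each supported OS and version.

```python
# Illustrative check for the factory-reset progress screen based on OCR text.
RESET_KEYWORDS = {
    "erasing", "resetting", "factory data reset", "restoring",
}

def reset_in_progress(ocr_text: str) -> bool:
    """Return True if OCR output from the device screen looks like a reset screen."""
    text = ocr_text.lower()
    return any(keyword in text for keyword in RESET_KEYWORDS)
```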
If the user does not wish to wait, the user may accept the lower price, and the kiosk 100 may continue to purchase the device at the lower price as described herein. Alternatively, if the user wishes to have the device tested further to obtain a higher price, the kiosk 100 may continue to test the device and then propose a more accurate (and possibly higher) price based on the more thorough testing. In some embodiments as described herein, the kiosk 100 and other associated systems may further evaluate the device 1000 by: running one or more tests on the device; using one or more cameras (e.g., camera 325) that image the device to detect, for example, whether there is a crack or other damage to the device display screen; and/or using cables, such as the electrical connectors 474A-c in FIG. 4A, or wireless connections (e.g., wireless transceiver 813; FIG. 8) that connect to the device and perform, for example, electrical testing of the device's functions.
The kiosk 100 may detect fraud even if the user does not follow the guidance provided by the prompt 1300 but instead shows a false unique identifier of the device 1000, such as a false IMEI number. For example, in some embodiments, the kiosk 100 may include a robotic stylus 1340 operably positioned in or near the detection region 216 that may interact with the display of the device 1000, such as by interacting with a touch sensor of the device. For example, the robotic stylus 1340 may scroll the screen of the device 1000. When the user presents an image of the IMEI, the robotic stylus 1340 can interact with the display screen 1302 of the device 1000 to ensure that the device screen can be scrolled. If the user displays a still image with a false IMEI, the screen of the device 1000 cannot be scrolled but can only be moved as a whole. Using the camera 325 and the OCR system, the processor 402 can detect whether the screen is being scrolled or merely moved, as illustrated in the sketch below. If the screen is simply moved, the processor 402 may determine that the user has presented a static image and notify the user to display the actual unique identifier of the device 1000. If the user declines, the self-service terminal 100 may terminate the transaction.
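One heuristic for distinguishing a live, scrolling screen from a panned photograph is sketched below: on a real phone the status bar stays fixed while the content scrolls, whereas a still image moves as a whole. The region split and thresholds are assumptions, not values from the system described above.

```python
# Sketch: compare the motion of the status-bar strip against the motion of the
# scrollable content between two consecutive camera frames.
import cv2
import numpy as np

def region_shift(prev_gray, curr_gray):
    """Vertical shift between two grayscale crops, via phase correlation."""
    (dx, dy), _ = cv2.phaseCorrelate(np.float32(prev_gray), np.float32(curr_gray))
    return dy

def looks_like_real_scroll(prev_bgr, curr_bgr, top_frac=0.12, min_shift=3.0):
    prev = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    cut = int(prev.shape[0] * top_frac)
    bar_shift = abs(region_shift(prev[:cut], curr[:cut]))    # status-bar strip
    body_shift = abs(region_shift(prev[cut:], curr[cut:]))   # scrollable content
    # Real scroll: body moves noticeably while the bar stays put.
    return body_shift > min_shift and bar_shift < min_shift / 3.0
```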
Additionally, the OCR system may determine whether the device 1000 has been stolen. For example, if the self-service terminal 100 receives the device 1000 in a powered-off state, a wireless or wired charger associated with the self-service terminal 100 may power up the device 1000. When a stolen device 1000 is powered on, the device may display a notification that the device has been reported stolen, such as a "Find My iPhone" notification. The notification may be in a standard format and have a predefined location on the display of the device 1000. The OCR system may recognize the notification and ask the user to disable it. If the user cannot disable the notification, the self-service terminal 100 can reject the device 1000. In another example, to determine whether the device 1000 is stolen, the kiosk 100 may access a database containing a list of unique identifiers of stolen devices. If the unique identifier associated with the device 1000 is in the stolen-device list, the kiosk 100 may reject the device 1000 and terminate the transaction.
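A minimal sketch of this screening step is shown below; the notification phrases and the blacklist client are hypothetical placeholders.

```python
# Illustrative stolen-device screening: flag the device if its screen shows a
# theft/loss notice or its IMEI appears in a blacklist. `stolen_db` stands in
# for whatever lookup service the kiosk would actually query.
STOLEN_SCREEN_PHRASES = (
    "this device has been lost", "find my iphone", "device is locked",
)

def device_reported_stolen(ocr_text: str, imei: str, stolen_db) -> bool:
    text = ocr_text.lower()
    screen_flagged = any(phrase in text for phrase in STOLEN_SCREEN_PHRASES)
    listed = stolen_db.contains(imei)   # hypothetical blacklist lookup
    return screen_flagged or listed
```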
Based on the unique identifier, e.g., the IMEI number, the kiosk 100 may determine whether the device 1000 has high resale value, e.g., a newer model of phone such as a Pixel 3 or iPhone X. If the device 1000 has a high resale value, the kiosk 100 may ask the user (e.g., via appropriate prompts presented on the kiosk display screen 104) to conduct more tests to determine a more accurate price. The kiosk 100 may present an offer to the user that requires the user to perform multiple tests taking a total of several minutes, and in exchange, the user may receive a higher purchase price for the device 1000. The kiosk 100 may present the offer at the kiosk display screen 104, or the kiosk 100 may cause the device 1000 to present the offer to the user, and the kiosk display screen 104 may show the offer presented on the device 1000. If the user accepts the offer, the kiosk 100 may instruct the user how to perform the additional tests. FIGS. 14A-17, described below, illustrate aspects of the tests that the kiosk 100 may cause the device 1000 to perform to provide a more accurate price. To perform the various tests described below with reference to FIGS. 14A-17, the user may remove the device 1000 from the kiosk 100 and, upon completion of a task, return the device to the kiosk (e.g., by repositioning the device 1000 on the test tray 212 with the tray 212 in the position shown in FIG. 7A). Alternatively, the user may interact with the device 1000 during testing while the device is placed on the tray 212 of the kiosk 100.
FIG. 13B illustrates a natural language user interface at the kiosk 100. In some embodiments, the kiosk may include a natural language processing system, and in addition to interacting with the kiosk 100 using the kiosk display 104 or the kiosk buttons 1350, the user may also interact with the kiosk using natural language. The natural language processing system may include a microphone 816 and a processor, such as the processor 402 in FIG. 4B. The processor 402 may support a machine learning model, such as a neural network or a transformer trained to understand natural language. The processor 402 may be or include an Artificial Intelligence (AI) accelerator such as a Tensor Processing Unit (TPU), a Graphics Processing Unit (GPU), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or the like.
For example, the kiosk display 104 may display a message 1360 to the user indicating a wake-up word that the user may use to activate natural language interaction. Once the user speaks the wake-up word, e.g., "hey, eco," the kiosk 100 may respond by issuing an audio prompt to the user and/or by displaying a visual prompt on the kiosk display screen 104. For example, the kiosk display 104 may display visual cues that include commands 1370, 1380 that the natural language processing system can recognize when spoken by the user, such as "get my estimate" or "drop." The commands 1370, 1380 may vary depending on what the kiosk 100 can recognize at a particular point in time.
In another example, if the user says "get estimate," the commands 1370, 1380 may change to "tell me how to do," "enter numbers," "help," or "return," as these are the commands that the kiosk 100 can recognize at that point in time. Alternatively, or in addition to the visually displayed commands 1370, 1380, the kiosk may provide the available commands to the user through audio emitted by one or more speakers 820. The kiosk 100 may use one or more directional microphones 816 to enable audio interaction with the user by recording and understanding the user's voice in a noisy environment (e.g., a mall).
Once the user speaks the wake-up word, the kiosk 100 may wait for user input for a predetermined amount of time, such as 5 or 10 seconds. If the user does not provide input within the predetermined amount of time, the kiosk 100 may return to a sleep mode until the kiosk again receives the wake-up word.
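The wake-word flow might be organized as in the sketch below, where `listen()` and `transcribe()` stand in for the kiosk's microphone and speech-recognition stack; the command set and timeout come from the description above, while everything else is an assumption.

```python
# Hedged sketch of the wake-word/command loop with a sleep timeout.
import time

WAKE_WORD = "hey eco"
ACTIVE_COMMANDS = {"get my estimate", "tell me how to do", "enter numbers", "help", "return"}

def wait_for_command(listen, transcribe, timeout_s=10.0):
    """Listen for one of the currently recognizable commands; None on timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        phrase = transcribe(listen()).strip().lower()
        if phrase in ACTIVE_COMMANDS:
            return phrase
    return None

def kiosk_voice_loop(listen, transcribe):
    """Sleep until the wake word is heard, then return the next recognized command."""
    while True:
        if transcribe(listen()).strip().lower() == WAKE_WORD:
            command = wait_for_command(listen, transcribe)
            if command is not None:
                return command
            # Timed out: fall back to sleep and wait for the wake word again.
```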
FIG. 14A is an isometric view of kiosk 100 showing a unique code (e.g., a QR code, a bar code, etc.) displayed on kiosk display screen 104. In some embodiments, to run a test on device 1000, kiosk 100 may direct a user to download a software application to device 1000, where the software application is configured to run one or more tests on device 1000. In other embodiments, as shown in FIG. 14A, to simplify the process without requiring the user to download a software application, the kiosk 100 may display a QR code 1430. The user may point the camera of the device 1000 at the displayed QR code 1430. When device 1000 detects/identifies QR code 1430, device 1000 may launch a web browser and direct the user to a website. The website may run one or more of the tests described below with reference to fig. 14B-17 via a browser launched by device 1000 (e.g., via HTML5 and Javascript).
Turning next to FIG. 14B, to test the network access of the device 1000, the kiosk 100 may instruct or otherwise cause the device 1000 to access a particular web page 1400 by, for example, having the device 1000 scan the QR code 1430, as described above. If the device 1000 successfully accesses the web page 1400, this indicates that the device has working network access. Scanning the QR code 1430 shown in FIG. 14A also exercises the camera of the device 1000 and may initiate testing of other functions of the device.
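Generating the on-screen QR code could be done with an off-the-shelf library, as in the sketch below using the third-party `qrcode` package; the test-page URL and session parameter are made-up placeholders.

```python
# Illustrative generation of the QR code shown on the kiosk display screen.
import qrcode

def make_test_qr(session_id: str):
    """Build a QR image pointing the phone's browser at the kiosk's test page."""
    url = f"https://example-kiosk-tests.invalid/start?session={session_id}"
    return qrcode.make(url)   # returns a PIL image the kiosk UI can display

# e.g. make_test_qr("abc123").save("qr.png")
```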
FIG. 14B illustrates a user interface of the device 1000 showing a test of device features, including touch screen functionality, in accordance with some embodiments of the present technology. In some embodiments, the user may run the test once the user has accepted the offer. To test touch screen functionality, the device 1000 can display an object (e.g., square 1410) that can be moved (e.g., horizontally and vertically) across the screen 1420. At each new position of the square 1410, the device 1000 may ask the user to select the square 1410. The device 1000 may determine whether the selection was detected and whether the selection matches the position (or approximate position) of the square 1410. If the selection matches the location of the square 1410 for a threshold number (or all) of the positions of the square 1410 on the screen 1420, the device 1000 may determine that the touch screen functionality is operational.
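The pass/fail logic of the moving-square test might resemble the sketch below. Per the description, the actual test runs in the phone's browser (HTML5/JavaScript); this Python version only illustrates the scoring, and the tolerance and threshold values are assumptions.

```python
# Sketch of the touch-test scoring: show the square at random positions and
# count how many taps land close enough to the target.
import random

def run_touch_test(get_tap, screen_w, screen_h, rounds=9, tolerance=60, pass_ratio=0.9):
    """get_tap(x, y) shows the square at (x, y) and returns the tap position."""
    hits = 0
    for _ in range(rounds):
        target_x = random.randint(0, screen_w - 1)
        target_y = random.randint(0, screen_h - 1)
        tap_x, tap_y = get_tap(target_x, target_y)
        if abs(tap_x - target_x) <= tolerance and abs(tap_y - target_y) <= tolerance:
            hits += 1
    return hits / rounds >= pass_ratio   # touch screen considered operational
```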
FIG. 15 illustrates a user interface of device 1000 showing testing of a microphone of device 1000 in accordance with some embodiments of the present technology. Device 1000 may instruct the user how to test the microphone via audio, video, text, pictures, or other similar means. For example, device 1000 can provide a selection (e.g., button) 1520 to start testing and recording audio, and a selection (e.g., button) 1500 to stop testing. The device 1000 can display a visualization 1510 of the recorded audio.
FIG. 16 illustrates a user interface of a device 1000 that displays a test of a device's location capabilities (e.g., GPS), in accordance with some embodiments of the present technology. For example, device 1000 may test GPS by determining a location of device 1000 using GPS and communicating the location of device 1000 to a user via audio, video, text, pictures, or other similar means. For example, device 1000 may display the detected location 1600 of device 1000 on a map. The device 1000 may request that the user provide confirmation of the detected location 1600, for example, by using buttons 1610, 1620, and/or by using audio communication.
FIG. 17 illustrates a user interface for testing the display of the device 1000 in accordance with some embodiments of the present technology. The device 1000 may display one or more colors, such as red, blue, and/or green, on the display 1302 of the device 1000. The device 1000 may present a query to the user via audio, video, text, pictures, or other similar means, such as: "In the area where we are cycling colors, do you see any pixels, elements, or points that do not change color?" The user can respond to the query using buttons 1710, 1720, and/or by using audio communication. A "no" response may indicate to the kiosk processor 402 that the display screen 1302 is defect-free; conversely, a "yes" response may indicate that the display screen 1302 may have one or more defects, such as defective pixels.
FIGS. 18A-18B illustrate a flow chart of a method for determining and providing a purchase price (e.g., dollar value) or other compensation value for a mobile device (e.g., the mobile device 1000). In step 1800, a processor, such as the processor 402 in FIG. 4B, may record a visual representation of a mobile device placed in proximity to at least one camera (e.g., a mobile device placed on the test tray 212 within view of the camera 325; FIG. 7A). The visual representation may include perspective distortion due to the position of the at least one camera relative to the device, as shown in FIG. 10.
In step 1810, the processor may create an augmented reality representation based on the visual representation. For example, as described above with reference to FIGS. 11-13, the processor may correct the perspective distortion associated with the visual representation, generate a message or instructions to guide the user to cause the device to visually provide additional information associated with the device, combine the corrected visual representation and the message or instructions to the user into the augmented reality representation, and provide the augmented reality representation to the user. The device information may include a brand of the device, a model of the device, a carrier associated with the device, a storage capacity associated with the device, or any combination thereof.
In a more specific example, the generated message or instructions may direct the user to log out of one or more personal user accounts associated with the device. In another specific example, the processor may generate the message or instructions guiding the user to cause the device to visually provide additional information based on the brand of the device; that is, the message or instructions directing the user may vary based on the brand of the device. The processor may determine the brand of the device (e.g., using the OCR system 807 in FIG. 8) by determining, for example, the curvature of the corners of the device from a visual representation of the device. For example, when the device is placed in a tray (e.g., tray 212 shown in FIG. 10), the processor may take a picture and determine the curvature of the device based on the picture. As described herein, the curvature of the device may be indicative of the brand of the device.
The processor may determine whether to perform the correction for perspective distortion based on the orientation of the device or based on the state of a housing surrounding the device. For example, when the device is in the gently sloped position shown in FIG. 7B, the processor may decide that correction of perspective distortion is not necessary because the camera 325 in FIG. 7A is looking directly at the device 210 (e.g., at a perpendicular angle relative to the device). Similarly, when the access door 112 in FIG. 7B is closed, the processor may not perform the correction because the user cannot interact with the device 210 while the access door 112 is closed, and therefore no correction of perspective distortion is needed.
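The correction itself can be expressed as a planar homography, as in the hedged sketch below, which assumes the four corners of the phone have already been located in the camera frame; corner detection and the output dimensions are outside the scope of the sketch.

```python
# Minimal perspective-correction sketch using OpenCV.
import cv2
import numpy as np

def correct_perspective(frame_bgr, device_corners, out_w=480, out_h=960):
    """Warp the distorted view of the phone into a fronto-parallel image."""
    src = np.float32(device_corners)                 # TL, TR, BR, BL in the frame
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame_bgr, homography, (out_w, out_h))

def should_correct(door_open: bool, camera_is_perpendicular: bool) -> bool:
    # Skip correction when the user cannot interact (door closed) or when the
    # camera already looks straight at the device.
    return door_open and not camera_is_perpendicular
```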
In step 1815, the processor may provide the created augmented reality representation to the user by, for example, displaying the augmented reality representation at a kiosk display (e.g., kiosk display screen 104 described above).
In step 1820, the processor may receive visually provided additional information associated with the device. The visually provided additional information may be provided on the display of the mobile device in the form of, for example, images and/or videos. In step 1830 (FIG. 18B), the processor may extract the device information from the visually provided additional information using, for example, Optical Character Recognition (OCR).
In step 1840, based on the extracted device information, the processor may generate one or more prices for the device, such as a first price for the device and a second price for the device, as well as an indication of an amount of time. The first price may represent the value of the device if the device is damaged, and the second price may represent the value of the device if the device is not damaged. In some embodiments, the processor does not perform a full inspection of the device before the first price and the second price are provided, and thus the processor does not yet know whether the device is damaged. Thus, if the processor later determines during inspection that the device is damaged, the first price represents an approximate value of the device. Similarly, if the processor determines during inspection that the device is not damaged, the second price represents an approximate value of the device. Accordingly, the first price is lower than the second price. The indication of the amount of time may represent a wait time for the processor to evaluate the device, determine whether the device is damaged, and propose an appropriate price (e.g., the first price or the second price) to the user based on the determined device condition.
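A toy version of the two-price quote is sketched below; the price table, damage multiplier, and wait time are fabricated placeholders used only to show the shape of the computation.

```python
# Hedged sketch of the two-price quote built from the extracted device info.
BASE_PRICES = {                      # (model, storage GB) -> undamaged offer in USD
    ("iPhone X", 64): 120.0,
    ("Pixel 3", 64): 70.0,
}
DAMAGED_FACTOR = 0.35                # fraction of value retained if damaged
EVALUATION_MINUTES = 4               # advertised wait for the full inspection

def quote(device_info):
    """Build the first (damaged) and second (undamaged) offers from OCR'd info."""
    base = BASE_PRICES.get((device_info["model"], device_info["storage_gb"]), 20.0)
    return {
        "first_price": round(base * DAMAGED_FACTOR, 2),
        "second_price": round(base, 2),
        "wait_minutes": EVALUATION_MINUTES,
    }
```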
In step 1850, the processor may provide the user with the first price for the device, the second price for the device, and the indication of the amount of time via, for example, the kiosk display 104. If the processor receives a selection of the second price from the user, this indicates that the user is willing to wait for the processor to perform a full evaluation of the device. The user is likely to select the second price when the user knows that the device is in good condition and expects that a full inspection of the device will result in the second price, or a price close to the second price, being offered. The processor may then determine the condition of the device. Upon determining that the device is not damaged, the processor may offer the user the second price for the device or a price similar to the second price.
As described above, in some embodiments, the processor may also detect whether the device is stolen. If the device is stolen, the processor may refuse to purchase the device. To detect whether the device is stolen, the processor may power up the device using the wireless charger if the received device has been powered off. A lost and/or stolen device may display an alarm when powered on to indicate that the owner of the device has reported that the device was lost and/or stolen. The processor may use OCR to determine whether the device displays an alert indicating that the device has been reported stolen and/or lost. Upon detecting the alert, the processor may provide a notification to the user to disable the displayed alert. The true owner of the device may disable the alarm. The processor may determine whether the displayed alarm is disabled. Upon determining that the displayed alarm is not disabled, the processor may determine that the device is stolen and refuse to accept the device and/or take other steps.
The processor may enable voice interaction with the user by providing recognizable voice commands to the user, as described above with reference to fig. 13B. The processor may obtain audio spoken by the user and may match the audio spoken by the user with at least one recognizable voice command, such as "help", "enter numbers", "give me an estimate", "restart", etc. The processor may execute the command upon finding a match between the audio spoken by the user and the recognizable voice command. If the processor is unable to match the spoken voice with the command, the processor may ask the user to repeat the command or make a selection using a button or touch screen on the kiosk.
A non-transitory computer-readable medium may store instructions described herein that, when executed by at least one computing device, may perform a method as generally shown and described herein, and equivalents thereof.
As described above with reference to FIG. 13A, in some embodiments the kiosk 100 may include a robotic stylus 1340 (which may also be referred to as a robotic finger) operably positioned within or near the detection region 216 that may interact with the display of the device 1000 (e.g., by interacting with a touch sensor of the device). For example, FIG. 19 is a partially schematic isometric view of a robotic stylus system 1910 configured in accordance with some embodiments of the present technology. In the illustrated embodiment, the robotic stylus system 1910 includes a guide or track 1912 located on or near a surface of the test tray 212, and a carriage arm 1916 extending at a right angle therefrom and having a proximal end portion movably (e.g., slidably) coupled to the track 1912. More specifically, in some embodiments, the proximal end portion of the carriage arm 1916 may be coupled to a drive belt on the track 1912 controlled by a motor (e.g., an electric stepper motor) to move the carriage arm 1916 up and down along a longitudinal axis 1914 of the track 1912, as indicated by arrow 1930. In other embodiments, the carriage arm 1916 may be coupled to the track 1912 by a pinion that mates with a rack extending longitudinally on the track, and activation of the pinion (via, for example, a motor) may move the carriage arm 1916 up and down the rack in the indicated direction. In another aspect of this embodiment, the robotic stylus system 1910 also includes a stylus holder 1920 movably coupled to the carriage arm 1916 by, for example, a belt on the carriage arm 1916 that, when activated by a drive motor, moves the stylus holder 1920 left and right along an axis 1918 of the carriage arm 1916, as indicated by arrow 1932. The stylus holder 1920 carries the stylus 1340 on a distal portion thereof. In the illustrated embodiment, the carriage arm 1916 is also configured to rotate about the axis 1918, as indicated by arrow 1934, to move the stylus 1340 toward and away from the device display 1302 as needed to interact with the display 1302, as described above with reference to FIG. 13A. In operation, the device 1000 may be positioned on the test tray 212 adjacent to the robotic stylus system 1910, as shown in FIG. 19. In some embodiments, the test tray 212 may include an adhesive pad on which the device 1000 is placed to help hold the device 1000 in place on the tray. The various drive motors of the robotic stylus system 1910 may be controlled by the kiosk processor to move the carriage arm 1916 and the stylus holder 1920 along and/or about the axes 1914 and 1918, respectively, as needed to move the stylus 1340 to interact with the device display 1302, as described above with reference to FIG. 13A; a simplified motion-control sketch follows below. Although FIG. 19 shows one example of a suitable robotic stylus system, the present technology is not limited to such a system, and other types of suitable systems may be used. For example, other such systems may include a stylus operably coupled to a robotic arm, and/or a stylus coupled to a solenoid and configured to move toward and away from the device display 1302 via operation of the solenoid. Additional robotic stylus systems are disclosed in U.S. provisional patent application No. 62/202,330 and U.S. non-provisional patent application No. 13/658,828, both of which are incorporated herein by reference in their entirety.
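A simplified motion-control sketch for such a two-axis stylus rig is shown below; the motor interfaces (`move_to`, `press`, `release`) and the steps-per-millimeter constant are hypothetical stand-ins for the kiosk hardware.

```python
# Illustrative translation of a target point on the device display into
# carriage positions along the track (axis 1914) and the carriage arm
# (axis 1918), followed by a rotation of the arm to tap or swipe.
from dataclasses import dataclass

STEPS_PER_MM = 40          # assumed stepper resolution on both axes

@dataclass
class StylusRig:
    track_motor: object    # moves the carriage arm along the track (up/down)
    arm_motor: object      # moves the stylus holder along the arm (left/right)
    rotate_motor: object   # rotates the arm to press/release the stylus

    def tap(self, x_mm: float, y_mm: float):
        """Move the stylus over (x_mm, y_mm) on the tray and tap the screen."""
        self.track_motor.move_to(int(y_mm * STEPS_PER_MM))
        self.arm_motor.move_to(int(x_mm * STEPS_PER_MM))
        self.rotate_motor.press()      # rotate toward the display
        self.rotate_motor.release()    # rotate away again

    def swipe_vertical(self, x_mm: float, y_from_mm: float, y_to_mm: float):
        """Press, drag along the track axis to scroll the screen, then release."""
        self.arm_motor.move_to(int(x_mm * STEPS_PER_MM))
        self.track_motor.move_to(int(y_from_mm * STEPS_PER_MM))
        self.rotate_motor.press()
        self.track_motor.move_to(int(y_to_mm * STEPS_PER_MM))
        self.rotate_motor.release()
```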
The following patents and patent applications are incorporated herein by reference in their entirety: U.S. Patent Nos. 10,572,946; 10,475,002; 10,445,708; 10,438,174; 10,417,615; 10,401,411; 10,269,110; 10,157,427; 10,127,647; 10,055,798; 9,885,672; 9,881,284; 8,200,533; 8,195,511; and 7,881,965; U.S. Patent Application Nos. 16/794,009; 16/788,169; 16/788,153; 16/719,699; 16/601,492; 16/575,090; 16/575,003; 16/556,104; 16/556,018; 16/534,741; 16/357,041; 16/195,785; 15/977,729; 15/901,526; 15/855,320; 15/672,157; 15/641,145; 15/630,460; 15/214,791; 15/091,487; 15/057,707; 14/967,183; 14/966,346; 14/964,963; 14/934,134; 14/663,331; 14/660,768; 14/598,469; 14/568,051; 14/498,763; 13/794,816; 13/794,814; 13/753,539; 13/733,984; 13/705,252; 13/693,032; 13/658,828; 13/658,825; 13/492,835; and 13/113,497; and U.S. Provisional Patent Application Nos. 63/116,007; 63/116,020; 63/066,794; 62/950,075; 62/807,165; 62/807,153; 62/804,714; 62/782,947; 62/782,302; 62/332,736; 62/221,510; 62/202,330; 62/169,072; 62/091,426; 62/090,855; 62/076,437; 62/073,847; 62/073,840; 62/059,132; 62/059,129; 61/607,572; 61/607,548; 61/607,001; 61/606,997; 61/595,154; 61/593,358; 61/583,232; 61/570,309; 61/551,410; 61/472,611; 61/347,635; 61/183,510; and 61/102,304. All patents and patent applications listed above, as well as any other patents or patent applications identified herein, are hereby incorporated by reference in their entirety.
Aspects of the present technology may be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. While aspects of the technology, such as certain functions, are described as being performed exclusively on a single device, the technology may also be practiced in distributed environments where functions or modules are shared among different processing devices that are linked through a communications network, such as a Local Area Network (LAN), Wide Area Network (WAN), or the Internet. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Aspects of the present technology may be stored or distributed on tangible computer-readable media, including magnetically or optically readable computer disks, hardwired or preprogrammed chips (e.g., EEPROM semiconductor chips), nanotechnology memory, biological memory, or other data storage media. Alternatively, computer implemented instructions, data structures, screen displays, and other data under aspects of the technology may be distributed over the Internet or other networks (including wireless networks) on a propagated signal on a propagation medium (e.g., an electromagnetic wave, a sound wave, etc.) over a period of time, or they may be provided over any analog or digital network (packet switched, circuit switched, or other scheme).
Reference throughout the foregoing description to features, advantages, or similar language does not imply that all of the features and advantages that may be realized with the present technology should be or are present in any single embodiment of the technology. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present technology. Thus, discussion of the features and advantages, and similar language, throughout this specification may, but does not necessarily, refer to the same embodiment.
Furthermore, the described features, advantages, and characteristics of the technology may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize that the technology may be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the technology.
Any patents and applications mentioned above, as well as other references, including any that may be listed in the accompanying specification, are incorporated herein by reference in their entirety, except to the extent that the material so incorporated conflicts with the explicit disclosure herein, in which case the language of the present disclosure shall control. Aspects of the technology can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the technology.
The above detailed description of examples and embodiments of the present technology is not intended to be exhaustive or to limit the present technology to the precise form disclosed above. Although specific examples of the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while processes are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks may be implemented in a number of different ways. Further, while processes are sometimes described as being performed in series, the processes may instead be performed or implemented in parallel or at different times.
The teachings of the present technology provided herein may be applied to other systems, not necessarily the systems described above. The various illustrated elements and acts described above may be combined to provide further embodiments of the present technology. Some alternative embodiments of the present technology may include not only the additional elements described above, but also fewer elements. Further, any particular number indicated herein is merely an example: different values or ranges may be used in alternative embodiments.
While the foregoing description describes various embodiments of the present technology and the best mode contemplated, no matter how detailed the above appears in text, the present technology can be practiced in many ways. The details of the system may vary considerably in its specific embodiments, but are still encompassed by the present technology. As noted above, particular terminology used when describing certain features or aspects of the technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the technology with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the technology to the specific examples disclosed in the specification, unless the above detailed description section explicitly defines such terms. Accordingly, the actual scope of the technology encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the technology under the claims.
From the foregoing it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but that various modifications may be made without deviating from the spirit and scope of various embodiments of the technology. Moreover, while various advantages associated with embodiments of the technology are described above in the context of these embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the technology is not limited except as by the appended claims.
While certain aspects of the technology are presented below in certain claim forms, applicants contemplate the various aspects of the technology in any number of claim forms. Accordingly, the applicants reserve the right to add additional claims after filing the application or to add such additional claim forms in the application or in continuation applications.
Claims (11)
1. A self-service terminal, characterized in that the self-service terminal comprises:
one or more processors; and
a non-transitory computer-readable medium storing instructions that, when executed by the one or more processors, cause the one or more processors to:
recording, by at least one camera of the self-service terminal, a visual representation of a mobile device placed in proximity to the at least one camera,
wherein the visual representation includes perspective distortion of the mobile device caused by a position of the at least one camera relative to the mobile device;
creating an augmented reality representation of the mobile device based on the visual representation by causing the one or more processors to perform the steps of:
correcting perspective distortion associated with the visual representation;
generating a message to guide a user to cause the mobile device to visually display additional information associated with the mobile device; and
combining the corrected visual representation and the message to generate the augmented reality representation;
providing, by a user interface of the self-service terminal, the created augmented reality representation to the user;
receiving visually provided additional information associated with the mobile device;
extracting device information from the visually provided additional information using Optical Character Recognition (OCR);
generating a first price for the mobile device and a second price for the mobile device based on the extracted device information,
wherein the first price represents a value of the mobile device when the mobile device is damaged,
wherein the second price represents a value of the mobile device when the mobile device is not damaged, and
wherein the first price is lower than the second price; and
presenting the first price and the second price to the user.
2. The kiosk of claim 1, wherein the instructions further cause the one or more processors to:
receiving a selection of the second price from the user, wherein the representation of the amount of time represents a wait time to evaluate the mobile device and determine that the mobile device is damaged;
determining a condition of the mobile device; and
upon determining that the mobile device is not damaged, providing the second price to the user.
3. The kiosk of claim 1 wherein the visual representation comprises at least one of an image or a video.
4. The kiosk of claim 1, wherein the instructions to create the augmented reality representation cause the one or more processors to:
generating the message to direct a user to log off from one or more personal user accounts associated with the mobile device; and
combining the corrected visual representation and the message directing the user to log off to create the augmented reality representation.
5. The kiosk of claim 1 wherein the device information includes at least one of a brand of the mobile device, a model of the mobile device, an operator associated with the mobile device, or a storage capacity associated with the mobile device.
6. The kiosk of claim 1, wherein the instructions to generate the message to guide a user to cause the mobile device to visually provide additional information associated with the mobile device cause the one or more processors to:
determining a brand of the mobile device by determining a curvature of a corner of the mobile device from the visual representation; and
generating the message to guide the user to cause the mobile device to visually provide additional information associated with the mobile device based on a brand of the mobile device.
7. The kiosk of claim 1, wherein the instructions further cause the one or more processors to:
detecting that the mobile device is stolen by causing the one or more processors to perform the steps of:
receiving, via a tray of the self-service terminal, the mobile device in a powered-off state;
causing the self-service terminal to power on the mobile device;
determining, using optical character recognition, that the mobile device displays an alert indicating that the mobile device is reported stolen;
providing, by the user interface, a notification to the user to disable the displayed alert;
determining whether the displayed alert is disabled; and
determining that the mobile device is stolen upon determining that the displayed alert is not disabled.
8. The kiosk of claim 1, wherein the instructions further cause the one or more processors to:
determining that the visual representation should be corrected based on an orientation of the mobile device or based on a state of a housing surrounding the mobile device.
9. The kiosk of claim 1, wherein the instructions further cause the one or more processors to:
enabling voice interaction with the user by providing recognizable voice commands to the user;
acquiring audio spoken by the user by a microphone of the self-service terminal; and
matching the audio spoken by the user with at least one of the recognizable voice commands.
10. A non-transitory computer-readable medium storing instructions that, when executed by at least one computing device of a kiosk, cause the at least one computing device to:
recording, by at least one camera of the self-service terminal, a visual representation of a mobile device placed in proximity to the at least one camera,
wherein the visual representation comprises perspective distortion of the mobile device due to a position of the at least one camera relative to the mobile device;
creating an augmented reality representation of the mobile device based on the visual representation by causing the at least one computing device to:
correcting perspective distortion associated with the visual representation;
generating a message to guide a user to cause the mobile device to visually display additional information associated with the mobile device; and
combining the corrected visual representation and the message to generate an augmented reality representation;
providing, by a user interface of the self-service terminal, the created augmented reality representation to a user;
receiving visually provided additional information associated with the mobile device;
extracting device information from the visually provided additional information using Optical Character Recognition (OCR);
generating a first price for the mobile device and a second price for the mobile device based on the extracted device information,
wherein the first price represents a value of the mobile device when the mobile device is damaged,
wherein the second price represents a value of the mobile device when the mobile device is not damaged, and
wherein the first price is lower than the second price; and
presenting, by the user interface, the first price and the second price to the user.
11. The non-transitory computer-readable medium of claim 10, wherein the instructions further cause the at least one computing device to:
receiving, by the user interface, a selection of the second price from the user, wherein the representation of the amount of time represents a wait time to evaluate the mobile device and determine that the mobile device is damaged;
determining a condition of the mobile device; and
upon determining that the mobile device is not damaged, offering, by the user interface, the second price to the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202221861013.4U CN218788211U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling mobile equipment |
Applications Claiming Priority (18)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063066794P | 2020-08-17 | 2020-08-17 | |
US63/066,794 | 2020-08-17 | ||
US202063116020P | 2020-11-19 | 2020-11-19 | |
US202063116007P | 2020-11-19 | 2020-11-19 | |
US63/116,007 | 2020-11-19 | ||
US63/116,020 | 2020-11-19 | ||
US17/445,083 US11922467B2 (en) | 2020-08-17 | 2021-08-13 | Evaluating an electronic device using optical character recognition |
US17/445,083 | 2021-08-13 | ||
US17/445,082 | 2021-08-13 | ||
USPCT/US2021/071192 | 2021-08-13 | ||
PCT/US2021/071191 WO2022040667A1 (en) | 2020-08-17 | 2021-08-13 | Evaluating an electronic device using a wireless charger |
US17/445,082 US12271929B2 (en) | 2020-08-17 | 2021-08-13 | Evaluating an electronic device using a wireless charger |
USPCT/US2021/071191 | 2021-08-13 | ||
PCT/US2021/071192 WO2022040668A1 (en) | 2020-08-17 | 2021-08-13 | Evaluating an electronic device using optical character recognition |
PCT/US2021/071200 WO2022040672A1 (en) | 2020-08-17 | 2021-08-16 | Kiosk for evaluating and purchasing used electronic devices |
US17/445,158 | 2021-08-16 | ||
USPCT/US2021/071200 | 2021-08-16 | ||
US17/445,158 US12033454B2 (en) | 2020-08-17 | 2021-08-16 | Kiosk for evaluating and purchasing used electronic devices |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202221861013.4U Division CN218788211U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling mobile equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN217113409U true CN217113409U (en) | 2022-08-02 |
Family
ID=80283281
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202221861013.4U Active CN218788211U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling mobile equipment |
CN202110945369.XA Pending CN114155645A (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for evaluating and purchasing used electronic equipment |
CN202121931846.9U Active CN217113409U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal and non-transitory computer readable medium |
CN202121931731.XU Active CN218446729U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling electronic equipment |
CN202110944311.3A Pending CN114078291A (en) | 2020-08-17 | 2021-08-17 | Evaluating electronic devices using optical character recognition |
CN202110945919.8A Pending CN114154652A (en) | 2020-08-17 | 2021-08-17 | Evaluating electronic devices using wireless chargers |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202221861013.4U Active CN218788211U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling mobile equipment |
CN202110945369.XA Pending CN114155645A (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for evaluating and purchasing used electronic equipment |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202121931731.XU Active CN218446729U (en) | 2020-08-17 | 2021-08-17 | Self-service terminal for recycling electronic equipment |
CN202110944311.3A Pending CN114078291A (en) | 2020-08-17 | 2021-08-17 | Evaluating electronic devices using optical character recognition |
CN202110945919.8A Pending CN114154652A (en) | 2020-08-17 | 2021-08-17 | Evaluating electronic devices using wireless chargers |
Country Status (1)
Country | Link |
---|---|
CN (6) | CN218788211U (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI839893B (en) * | 2022-10-14 | 2024-04-21 | 國立成功大學 | Field image remote monitoring system and fish-farm image remote monitoring system |
- 2021
- 2021-08-17 CN CN202221861013.4U patent/CN218788211U/en active Active
- 2021-08-17 CN CN202110945369.XA patent/CN114155645A/en active Pending
- 2021-08-17 CN CN202121931846.9U patent/CN217113409U/en active Active
- 2021-08-17 CN CN202121931731.XU patent/CN218446729U/en active Active
- 2021-08-17 CN CN202110944311.3A patent/CN114078291A/en active Pending
- 2021-08-17 CN CN202110945919.8A patent/CN114154652A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN218446729U (en) | 2023-02-03 |
CN114155645A (en) | 2022-03-08 |
CN218788211U (en) | 2023-04-04 |
CN114078291A (en) | 2022-02-22 |
CN114154652A (en) | 2022-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11922467B2 (en) | Evaluating an electronic device using optical character recognition | |
US12033454B2 (en) | Kiosk for evaluating and purchasing used electronic devices | |
US12271929B2 (en) | Evaluating an electronic device using a wireless charger | |
US12322259B2 (en) | Systems and methods for vending and/or purchasing mobile phones and other electronic devices | |
US20240185317A1 (en) | Kiosk for evaluating and purchasing electronic devices and associated methods of manufacture and use | |
US20240265364A1 (en) | Systems and methods for vending and/or purchasing mobile phones and other electronic devices | |
US20220068076A1 (en) | Kiosk for evaluating and purchasing used electronic devices | |
US9911102B2 (en) | Application for device evaluation and other processes associated with device recycling | |
CN115581122A (en) | System and method for vending and/or purchasing mobile phones and other electronic devices | |
US10572946B2 (en) | Methods and systems for facilitating processes associated with insurance services and/or other services for electronic devices | |
JP2015505999A (en) | Method and apparatus for recycling electronic devices | |
CN217113409U (en) | Self-service terminal and non-transitory computer readable medium | |
WO2022133498A1 (en) | Systems and methods for vending and/or purchasing mobile phones and other electronic devices | |
EP4196943B1 (en) | Evaluating an electronic device using a wireless charger | |
CN219936464U (en) | Device for mobile electronic equipment in self-service terminal and self-service terminal | |
WO2024173591A1 (en) | Kiosk for evaluating and purchasing electronic devices and associated methods of manufacture and use | |
CA3192240A1 (en) | Kiosk for evaluating and purchasing used electronic devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||