US20120036194A1 - Collaboration agent - Google Patents
Collaboration agent
- Publication number
- US20120036194A1 (U.S. application Ser. No. 13/277,781)
- Authority
- US
- United States
- Prior art keywords
- user
- communications session
- collaboration agent
- information
- collaboration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1069—Session establishment or de-establishment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1076—Screening of IP real time communications, e.g. spam over Internet telephony [SPIT]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/54—Presence management, e.g. monitoring or registration for receipt of user log-on information, or the connection status of the users
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/48—Arrangements for recalling a calling subscriber when the wanted subscriber ceases to be busy
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/56—Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1881—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast with schedule organisation, e.g. priority, sequence management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/66—Arrangements for connecting between networks having differing types of switching systems, e.g. gateways
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- H04L65/4046—Arrangements for multi-party communication, e.g. for conferences with distributed floor control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/42348—Location-based services which utilize the location information of a target
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/42365—Presence services providing information on the willingness to communicate or the ability to communicate in terms of media capability or network connectivity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/432—Arrangements for calling a subscriber at a specific time, e.g. morning call service
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/56—Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
- H04M3/563—User guidance or feature selection
- H04M3/565—User guidance or feature selection relating to time schedule aspects
Definitions
- This invention relates generally to communications, and in particular to a collaboration agent for managing real-time communications on behalf of a user.
- Managing real-time communications, such as telephone calls, text messages, and instant messaging (IM) messages, can be time-consuming and frustrating at times.
- In a business environment, individuals frequently feel compelled to respond to real-time communications as quickly as possible, but may be unable to do so because they are currently engaged in communicating with someone else.
- For example, an individual may be using a work phone to participate in a conference call when the individual is alerted that a call is coming in on a second line of the work phone.
- Other than observe the calling party's phone number, there is little the individual can do.
- The calling party may then attempt to call the individual's cell phone, and when that is similarly unsuccessful in reaching the individual, attempt to send the individual a text message or an instant message.
- The individual may be aware of each of these attempts by the calling party to reach the individual, but be unable to communicate with the calling party without disrupting the conference call.
- When the conference call is over, the individual may have to contact multiple voice messaging systems to determine whether or not the calling party left a message and the purpose of the call.
- The individual may only later learn that the calling party had an urgent need to contact the individual for a reason that, had the individual been aware of it, would have caused the individual to terminate participation in the conference call to accept the call from the calling party.
- Business environments frequently have one or more sources of information that are useful in determining an individual's availability, presence, or location.
- For example, most business environments enable their employees to use an electronic calendar to schedule and keep track of meetings.
- For a particular meeting, the electronic calendar may include information such as a time and date of the meeting, the attendees of the meeting, and a physical location or dial-in information associated with the meeting.
- Business environments may also enable their employees to communicate with each other via instant messaging, which may include features such as presence indicators that can be used to indicate availability and willingness-to-communicate states of the respective employee. While aspects of these information sources may be visible to other users of the same application, they are not typically used in a real-time application that manages voice communications and other real-time communications.
- However, such information may be very valuable in establishing and facilitating communications between individuals. For example, in the example provided previously, had the calling party reached a service that could access the individual's electronic calendar, the calling party may have been able to determine another time they would be likely to reach the individual. Alternately, had the calling party reached a service that could indicate the individual was present in their office based on presence information associated with the individual, the calling party may have been able to simply walk to the individual's office. In general, it would be extremely helpful in today's business environment if an intelligent agent could help manage an individual's real-time communications based on information maintained about the individual in one or more information sources.
- The present invention relates to a collaboration agent that manages real-time communications on behalf of a user.
- The collaboration agent includes a bridge, sometimes referred to as a conference bridge, and a contextual information interface adapted to obtain information about the user, such as availability, location, or presence.
- The collaboration agent anchors calls associated with the user to the conference bridge in a network rather than to a user device.
- The user can interface with the collaboration agent via one or more end-user devices.
- The collaboration agent can interface with other collaboration agents to facilitate communications between associated users. For example, the collaboration agent can receive requests from other collaboration agents for available times during which the associated user is available for a conversation.
- According to one embodiment of the invention, the collaboration agent includes a speech recognition processor for receiving commands from the user.
- The collaboration agent also includes an ability to communicate with the user, or other users, via text-to-speech processing, speech generation, or playing recorded messages.
- The collaboration agent establishes communications sessions between the conference bridge and a user device associated with the user, and between the conference bridge and other conference bridges associated with other users.
- The collaboration agent can mix multiple communications sessions associated with the conference bridge together, or can keep certain communications sessions separate from other communications sessions.
- The collaboration agent can deliver an audio signal to a user from multiple sources while preventing each source from hearing the audio associated with the other source.
- According to one embodiment of the invention, the collaboration agent obtains meeting information from a calendar associated with the user via the contextual information interface.
- The collaboration agent can use the meeting information to remind the user of imminent or existing meetings of which the user is an attendee.
- The collaboration agent can also automatically connect the user to another conference bridge associated with another user based on the meeting information, and indicate to the user that they have been joined to the other conference bridge, without direction from the user.
- According to another embodiment of the invention, the collaboration agent can receive a request from a second user to establish a communications session between the conference bridge of the user and the conference bridge of the second user.
- The collaboration agent can determine from meeting information obtained via the contextual information interface that the user has a meeting scheduled at, or substantially near, the time of the request, and that the second user is an attendee of the meeting.
- The collaboration agent can automatically establish the communications session between the conference bridge of the first user and the conference bridge of the second user based on the meeting information and the identity of the second user.
- According to yet another embodiment of the invention, the collaboration agent can receive a request to establish a first communications session between the conference bridge of the user and a second conference bridge of a second user for the purpose of enabling the second user to communicate with the first user.
- The collaboration agent determines that the first user is not present and is travelling on business.
- The collaboration agent successfully establishes a second communications session between the conference bridges of the users and indicates that the second user desires to speak with the user.
- Upon receiving an indication from the user, the collaboration agent enables communications between the first communications session and the second communications session, while maintaining the confidentiality of the directory number of the user.
- Those skilled in the art will appreciate the scope of the present invention and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
- The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the invention, and together with the description serve to explain the principles of the invention.
- FIG. 1 is a block diagram illustrating a plurality of collaboration agents according to one embodiment of the invention.
- FIG. 2 is a block diagram illustrating a collaboration agent shown in FIG. 1 in greater detail.
- FIG. 3 is a block diagram illustrating two collaboration agents managing communications on behalf of respective users.
- FIG. 4 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to one embodiment of the invention.
- FIG. 5 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to another embodiment of the invention.
- FIG. 6 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the invention.
- FIG. 7 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the invention.
- The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
- The present invention enables real-time and near real-time communications, such as telephone calls, instant messaging (IM) messages, and text messages (e.g., Short Message Service (SMS) messages), to be managed on a user's behalf by an intelligent controller, referred to herein as a collaboration agent, which preferably operates in conjunction with other collaboration agents to simplify, manage, and facilitate communications between respective users. While the invention described herein has primary applicability in a business environment and will be discussed in the context of a business environment, those skilled in the art will recognize applicability of the invention in other environments, and the invention is not limited to use in a business environment.
- FIG. 1 is a block diagram illustrating a plurality of collaboration agents according to one embodiment of the invention.
- A plurality of collaboration agents 10 are coupled to a network 12 via a communications link 14 .
- The network 12 can comprise any combination of wired or wireless technologies, and can utilize any data transport technologies, such as Transmission Control Protocol/Internet Protocol (TCP/IP) or Sequenced Packet Exchange/Internetwork Packet Exchange (SPX/IPX), that enable communications between the collaboration agents 10 .
- The network 12 can, for example, comprise the Internet or can comprise a private enterprise network.
- The network 12 can also comprise a combination of networks, such as multiple private enterprise networks, that are coupled together to enable communications between multiple collaboration agents 10 associated with different enterprises.
- The communications links 14 can comprise any suitable access communications links, such as wired or wireless access links including Ethernet, broadband cable or Digital Subscriber Line (DSL) lines, WiFi, and the like.
- Each collaboration agent 10 has an associated user 16 , and the collaboration agents 10 manage communications on behalf of the respective user 16 .
- The collaboration agents 10 use associated contextual information 18 to make decisions on behalf of the respective user 16 , or to locate the respective user 16 , as described in greater detail herein.
- The collaboration agents 10 can communicate with one another via messaging and predefined requests using a standard or conventional messaging protocol.
- Notably, each collaboration agent 10 includes a conference bridge 20 for anchoring calls associated with the user 16 .
- Each collaboration agent 10 can communicate with one or more user devices 22 , such as wired, cordless or cellular telephones, Personal Digital Assistants (PDAs), computers, and the like, to provide communications with the user 16 .
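- The arrangement just described — one agent and one conference bridge per user, with agents reachable over the network 12 — can be pictured with a small data model. The Python sketch below is purely illustrative: the class names, field names, and addresses are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class ConferenceBridge:
    """Anchor point for all communications sessions of one user (bridge 20)."""
    bridge_id: str
    sessions: List[str] = field(default_factory=list)      # session identifiers joined to the bridge

@dataclass
class ContextualInfo:
    """Contextual information 18: presence 30, location 32, calendar 34."""
    presence: Optional[str] = None                          # e.g. "available", "busy"
    location: Optional[str] = None                          # e.g. "in the office", "traveling"
    calendar: List[dict] = field(default_factory=list)      # meeting records

@dataclass
class CollaborationAgent:
    """One agent (10) per user (16), reachable by other agents at a network address."""
    user: str
    address: str                                            # address other agents use to send requests
    bridge: ConferenceBridge = field(default_factory=lambda: ConferenceBridge("bridge-0"))
    context: ContextualInfo = field(default_factory=ContextualInfo)
    user_devices: List[str] = field(default_factory=list)   # phones, PDAs, computers (devices 22)

# A simple directory stands in for the lookup an agent would perform over the network 12.
directory: Dict[str, CollaborationAgent] = {
    "alice": CollaborationAgent("alice", "sip:agent-alice@example.com",
                                user_devices=["work-phone", "cell-phone"]),
    "bob": CollaborationAgent("bob", "sip:agent-bob@example.com",
                              user_devices=["cell-phone"]),
}

if __name__ == "__main__":
    print(directory["alice"].address, directory["alice"].bridge.bridge_id)
```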
- FIG. 2 is a block diagram illustrating a collaboration agent shown in FIG. 1 in greater detail.
- The collaboration agent 10 preferably includes a speech processing interface 24 adapted to interface to speech processing technology that enables the collaboration agent 10 to receive instructions from a respective user 16 .
- The speech processing technology can be integral with the collaboration agent 10 , or can be a stand-alone speech recognition processor. If stand-alone, the speech recognition processor 24 can devote a speech recognition channel to each collaboration agent 10 upon initiation of the collaboration agent 10 , or can provide a speech recognition channel on an on-demand basis.
- For one example of using speech recognition processing in an on-demand application, please see U.S. patent application Ser. No. 12/341,246, entitled METHOD AND SYSTEM FOR DETECTING A RELEVANT UTTERANCE, filed Dec. 22, 2008, which is hereby incorporated herein in its entirety.
- The speech processing interface, according to one embodiment of the invention, enables a user 16 to communicate with the collaboration agent 10 on a hands-free basis and in an intuitive and natural manner.
- The collaboration agent 10 also preferably has text-to-speech capabilities and the ability to play prerecorded messages or generated messages on behalf of the user 16 to another user 16 , or to provide instructions or help to the respective user 16 .
- The collaboration agent 10 includes a control system 26 that includes a conventional or proprietary operating system capable of executing one or more programs loaded into a memory 28 that contain instructions suitable for carrying out the functionality described herein.
- According to one embodiment of the invention, the collaboration agent 10 comprises a media application server.
- The collaboration agent 10 interfaces to contextual information 18 that can include, for example, presence information 30 , location information 32 , or calendar information 34 .
- The presence information 30 can comprise any suitable information that indicates a presence state of the user 16 , or that implies a presence state of the user 16 . For example, many IM applications enable a user to indicate a presence state such as “available,” “busy,” “out to lunch,” and the like.
- The collaboration agent 10 can communicate with respective IM applications to obtain such presence information via any suitable protocols, such as Session Initiation Protocol (SIP), Extensible Messaging and Presence Protocol (XMPP), and the like. Alternately, the collaboration agent 10 may be able to infer a presence state through other information. For example, when the user device 22 is off-hook, the collaboration agent 10 may infer a presence state of “busy” for the user 16 .
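- A minimal sketch of how such a presence state might be derived is shown below, assuming (hypothetically) that the agent can read an explicit IM presence string and an off-hook flag for each user device 22; the function name and data shapes are illustrative only.

```python
from typing import List, Optional

def infer_presence(im_presence: Optional[str], devices_off_hook: List[bool]) -> str:
    """Combine explicit IM presence (presence information 30) with an inferred state.

    An explicit IM state such as "busy" or "out to lunch" wins; otherwise a
    device that is off-hook implies the user is busy on a call.
    """
    if im_presence:                       # explicit state published by an IM application
        return im_presence
    if any(devices_off_hook):             # inferred: a user device 22 is in use
        return "busy"
    return "available"

print(infer_presence(None, [True, False]))    # -> "busy"
print(infer_presence("out to lunch", []))     # -> "out to lunch"
```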
- The location information 32 can include any information that identifies, or implies, a location of the user 16 .
- The collaboration agent 10 may interface with a Global Positioning System (GPS) device associated with the user 16 , or may be able to determine a location through cellular telephone triangulation techniques known to those skilled in the art.
- Some enterprises provide employees with Radio Frequency Identification (RFID) tags that monitor the physical location of an employee and provide that information to a central database, or monitor devices used by the user 16 in a similar manner.
- The user 16 may also set a location to indicate that they are “in the office,” “at home,” “in the car,” and the like.
- The calendar information 34 comprises information associated with a respective electronic calendar, or other activity tracking mechanism, of the user 16 .
- Electronic calendars are widely used today to reserve meetings with other users 16 .
- Electronic calendars, such as Microsoft Outlook, enable users 16 with sufficient privileges to exchange meeting requests and reserve time slots on other users' 16 electronic calendars upon demand.
- Meeting requests typically include information such as the attendees of the meeting, location or dial-in information associated with the meeting, starting time and expected ending time of the meeting, and the like.
- Calendar information 34 can be obtained by the collaboration agent 10 using an appropriate application programming interface provided by the respective calendar provider, such as the Microsoft Outlook Application Programming Interface (MAPI), the Google Calendar API, or a standard iCal calendar interface.
- While the presence information 30 , the location information 32 , and the calendar information 34 are shown separately in FIG. 2 , each of the presence information 30 , the location information 32 , and the calendar information 34 constitutes contextual information 18 , and similar contextual information 18 may come from one or more of the presence information 30 , the location information 32 , and the calendar information 34 .
- For example, the collaboration agent 10 may infer presence and availability by examining the calendar information 34 to determine whether the user 16 is involved in a meeting.
- While calendar, presence, and location information are provided as examples of contextual information used in the present invention, those skilled in the art will appreciate that the invention is not limited thereto. Any type of contextual information that may be accessible to the collaboration agent 10 for determining availability of a respective user 16 may be used.
- For example, the collaboration agent 10 may use profile information associated with a cellular telephone of the user 16 that is accessible by the collaboration agent 10 , or electronic records identifying the user 16 as being “on” or “off” duty.
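- Taken together, the three sources let an agent answer the question "is my user available right now?" One hedged illustration follows; the data shapes and the precedence (calendar first, then presence, then location) are assumptions made for the example, not a rule stated in the text.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Meeting:
    start: datetime
    end: datetime
    attendees: List[str]

def is_available(now: datetime, presence: str, location: str, meetings: List[Meeting]) -> bool:
    """Decide availability from contextual information 18."""
    in_meeting = any(m.start <= now < m.end for m in meetings)   # calendar information 34
    if in_meeting or presence == "busy":                          # presence information 30
        return False
    if location in ("traveling", "off duty"):                     # location information 32
        return False
    return True

now = datetime(2024, 1, 8, 14, 30)
meetings = [Meeting(now - timedelta(minutes=30), now + timedelta(minutes=30), ["alice", "bob"])]
print(is_available(now, "available", "in the office", meetings))  # -> False: a meeting is in progress
```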
- The collaboration agent 10 interfaces with and controls the conference bridge 20 via one or more suitable protocols, such as Call Control Extensible Markup Language (CCXML), Media Resource Control Protocol (MRCP), Media Server Control (mediactrl), and the like.
- The collaboration agent 10 uses the conference bridge 20 as an anchor point for communications sessions associated with the user 16 .
- The conference bridge 20 includes a mixer, and enables certain communications sessions joined to the conference bridge 20 to be coupled with other communications sessions joined to the conference bridge 20 as desired. For example, a first communications session may exist between the user device 22 and the conference bridge 20 , and a second communications session may exist between the conference bridge 20 and another conference bridge 20 associated with another user 16 . The first and second communications sessions can be joined together to enable the users 16 to converse with one another.
- During the conversation, the collaboration agent 10 may determine that a meeting obtained from the calendar information 34 begins in five minutes.
- The collaboration agent 10 can use the first communications session to inform the user 16 of the imminent meeting without enabling the audio signals to be provided to the second communications session, so that the other user 16 is unaware that the first user 16 received an audible reminder of an imminent meeting.
- Another example of the mixing features of the conference bridge 20 is the ability for certain users 16 to hear the audio signals of some users 16 but not other users 16 .
- Assume that a first user 16 has organized a first conference call with a first group of users 16 on the conference bridge 20 for 9:00 am-10:00 am, and a second conference call with a second group of users 16 on the conference bridge 20 for 10:00 am-11:00 am.
- Assume further that a third user 16 from the second group of users 16 is joined to the conference bridge 20 for the second conference call beginning at 10:00 am.
- The conference bridge 20 does not provide the audio signals from the communications sessions associated with the first group of users 16 to the third user 16 awaiting the second conference call.
- The conference bridge 20 may announce to the first user 16 associated with the conference bridge 20 that the third user 16 has joined the conference bridge 20 and awaits the 10:00 am call.
- The conference bridge 20 can enable the first user 16 to welcome the third user 16 and advise the third user 16 that the call will be beginning shortly, while simultaneously preventing the second group of users 16 from hearing the discussion between the first user 16 and the third user 16 .
- Anchoring calls to the conference bridge 20 provides several notable features, as will be discussed herein. For example, if the user 16 determines that they need to switch from a work phone to a cellular phone in the middle of a conversation, the individual or individuals with whom the user 16 is currently speaking via communications sessions joined to the conference bridge 20 need not redial or be inconvenienced. The user 16 only needs to establish a new communications session between the cell phone and the conference bridge 20 . The new communications session can be initiated by the user 16 dialing a telephone number associated with the conference bridge 20 or the collaboration agent 10 , or by asking the collaboration agent 10 to initiate a call to the cell phone associated with the user 16 .
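- One way to picture this mixing behaviour is as a pair of per-session flags on the bridge: whether the bridge plays audio to a session and whether it accepts audio from it. The toy mixer below is an assumed representation — the patent describes the behaviour (whispers, selective mixing, re-anchoring a new device), not this model — but it shows how all three follow from the same mechanism.

```python
class BridgeMixer:
    """Toy conference-bridge mixer: each joined session has send/receive flags."""

    def __init__(self):
        self.sessions = {}                 # session id -> {"hears": bool, "is_heard": bool}

    def join(self, session_id, hears=True, is_heard=True):
        self.sessions[session_id] = {"hears": hears, "is_heard": is_heard}

    def leave(self, session_id):
        self.sessions.pop(session_id, None)

    def whisper(self, target_session, message):
        """Deliver an announcement to one session only (e.g. a meeting reminder)."""
        return {target_session: message}

    def mix_for(self, listener):
        """Sessions a given listener would hear in its mixed signal."""
        if not self.sessions.get(listener, {}).get("hears"):
            return []
        return [s for s, flags in self.sessions.items() if s != listener and flags["is_heard"]]

mixer = BridgeMixer()
mixer.join("user-work-phone")                  # session between the user device and the bridge
mixer.join("remote-bridge")                    # session to another user's conference bridge
mixer.join("early-arrival", is_heard=False)    # waiting attendee kept out of the main mix
print(mixer.mix_for("user-work-phone"))        # -> ['remote-bridge']

# Switching devices mid-call re-anchors only the user's own leg; the far end is untouched.
mixer.leave("user-work-phone")
mixer.join("user-cell-phone")
print(mixer.mix_for("user-cell-phone"))        # -> ['remote-bridge']
```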
- FIG. 3 is a block diagram illustrating two collaboration agents managing communications on behalf of respective users.
- FIG. 3 will be used herein to illustrate several ways in which the collaboration agents 10 can manage and facilitate real-time communications on behalf of a respective user 16 .
- FIG. 3 will also be used in conjunction with FIGS. 4-7 to illustrate particular embodiments disclosed therein. While FIG. 3 illustrates a collaboration agent 10 A and a collaboration agent 10 B, the collaboration agents 10 A, 10 B may be referred to collectively as the collaboration agents 10 where the discussion does not relate to a particular collaboration agent 10 A or collaboration agent 10 B.
- Each collaboration agent 10 is preferably a routing point for all real-time communications destined for a respective user 16 .
- This can be managed, for example, by forwarding calls from the user's telephone devices to a telephone number associated with the respective collaboration agent 10 , and by having IM messages forwarded from the respective IM applications to an address associated with the respective collaboration agent 10 via a network feature or other mechanisms known to those skilled in the art.
- Such configuration can be handled by a user 16 or by an administrator when setting up a collaboration agent 10 for use by a particular user 16 .
- The collaboration agent 10 receives textual communications, such as IM and text messages, and can provide them to the user 16 based on the contextual information 18 associated with the user 16 . For example, assume that the user 16 is driving home from the office and the collaboration agent 10 receives an IM message for the user 16 . The collaboration agent 10 can obtain contextual information 18 and determine that the user 16 is no longer in the office. The collaboration agent 10 can use the conference bridge 20 to attempt to contact the user 16 via the user's 16 cell phone. Assuming that the user 16 answers the cell phone, the collaboration agent 10 can provide a prerecorded message to the user 16 saying “An IM message has arrived.” The collaboration agent 10 couples the communications session between the conference bridge 20 and the user device 22 to the speech processing interface 24 .
- In response, the user 16 may say “read the message.”
- The speech processing interface 24 detects the command “read” and provides this information to the collaboration agent 10 .
- The collaboration agent 10 uses text-to-speech processing to read the IM message to the user 16 .
- The user 16 may reply “Send IM. Thanks for the message, I agree.”
- The collaboration agent 10 , via the speech processing interface 24 , recognizes the command to create and send an IM message, and uses speech-to-text processing to convert the speech “Thanks for the message, I agree” to a textual format.
- The collaboration agent 10 then responds to the IM message with an IM message saying “Thanks for the message, I agree.”
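- The spoken exchange above amounts to a very small command grammar: a "read" command and a "Send IM …" command whose remainder becomes the reply text. The parser below is a hypothetical sketch of that grammar operating on already-recognized text; in the described system the speech processing interface 24 and speech-to-text processing would sit in front of it.

```python
def handle_utterance(utterance, pending_im, send_im, speak):
    """Dispatch a recognized utterance to a collaboration-agent action.

    `pending_im` is the last received IM text; `send_im` and `speak` are
    callables standing in for IM delivery and text-to-speech playback.
    """
    text = utterance.strip()
    lowered = text.lower()
    if lowered.startswith("read"):                  # "read the message"
        speak(pending_im)                           # text-to-speech of the IM body
    elif lowered.startswith("send im"):             # "Send IM. Thanks for the message, I agree."
        reply = text[len("send im"):].lstrip(" .")  # remainder becomes the outgoing IM text
        send_im(reply)
    else:
        speak("Sorry, I did not understand that command.")

handle_utterance("read the message", "Can you review the draft?", print, print)
handle_utterance("Send IM. Thanks for the message, I agree.", "", print, print)
```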
- The collaboration agent 10 A can communicate with the collaboration agent 10 B on behalf of the user 16 A as appropriate.
- For example, the collaboration agent 10 B may initiate a request to open a voice communications session between the conference bridge 20 A and the conference bridge 20 B to enable the user 16 B to converse with the user 16 A.
- The collaboration agent 10 A may determine that the user 16 A is currently on the phone, and may determine from the calendar information 34 A that the user 16 A is engaged in a meeting until 3:00 PM, but is available at 3:00 PM for a conversation with the user 16 B.
- The collaboration agent 10 A can send a message to the user 16 B that the user 16 A is not available, but that the user 16 A can speak with the user 16 B at 3:00 PM.
- The collaboration agent 10 B can determine whether the user 16 B is available at 3:00 PM and, if so, can confirm a meeting with the collaboration agent 10 A, and each collaboration agent 10 A and 10 B can update the respective calendar to record the meeting at 3:00 PM.
- Each collaboration agent 10 may have rules, or filters, that can affect how the collaboration agent 10 manages or facilitates a real-time communication based on a particular criterion. For example, assume in the previous example that the user 16 B is the manager of the user 16 A. The user 16 A may set up a rule stating that any attempts from the user 16 B to contact the user 16 A are to be communicated to the user 16 A if the user 16 A is present at a user device 22 .
- The collaboration agent 10 A may then “whisper,” or communicate, to the user 16 A via the conference bridge 20 A that the user 16 B is attempting to contact the user 16 A.
- The whisper cannot be heard by the other participants of the call in which the user 16 A is currently engaged.
- The user 16 A may determine that his presence is not needed on the call, terminate his participation in the call, and indicate to the collaboration agent 10 A to establish the communications session between the conference bridge 20 A and the conference bridge 20 B to enable the user 16 B to communicate with the user 16 A.
- The collaboration agent 10 is preferably always executing on behalf of the user 16 .
- The collaboration agent 10 can make decisions on behalf of the user 16 to either arrange meetings in the future, or attempt to contact the user 16 , based on criteria associated with a respective real-time communication.
- FIG. 4 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to one embodiment of the invention.
- FIG. 4 will be discussed in conjunction with FIG. 3 .
- Assume that the user 16 A wishes to communicate with the user 16 B.
- The user 16 A utters a command or a hot word into the user device 22 A, which is coupled to the conference bridge 20 A via a first communications session 36 A.
- The user 16 A remains coupled to the conference bridge 20 A via the first communications session 36 A during normal business hours so that the collaboration agent 10 A can communicate with the user 16 A via the user device 22 A based on the receipt of real-time communications.
- For example, when the user 16 A arrives at the office each morning, he uses a work telephone to dial into a telephone number associated with the respective collaboration agent 10 A to establish a communications session between the work telephone and the collaboration agent 10 A.
- The user 16 A leaves the work telephone connected to the collaboration agent 10 A throughout the day, and enables a speaker phone capability and hands-free speaking capability of the work telephone, so the user 16 A can communicate with the collaboration agent 10 A by merely speaking in proximity of the telephone, and similarly can hear information provided by the collaboration agent 10 A when in proximity of the work telephone.
- The collaboration agent 10 A recognizes a command from the user 16 A and prompts the user 16 A for a particular command.
- The user 16 A indicates a desire to contact the user 16 B by identifying the user 16 B by name (step 100 ).
- The collaboration agent 10 A searches a contacts list associated with the user 16 A for the name of the user 16 B (step 102 ). If the collaboration agent 10 A cannot locate the name of the user 16 B in the contacts list (step 104 ), the collaboration agent 10 A may refer to an enterprise-wide directory, such as a Lightweight Directory Access Protocol (LDAP) directory (step 106 ). Assume that the collaboration agent 10 A finds multiple names that are similar or identical to the name of the user 16 B (step 108 ).
- The collaboration agent 10 A can provide, via the first communications session 36 A, a message to the user 16 A requesting clarification, such as, for example, a last name associated with the user 16 B, to remove the ambiguity (step 110 ).
- The user 16 A can indicate the complete name of the user 16 B, removing the ambiguity.
- The collaboration agent 10 A can pull, or otherwise extract, a collaboration agent profile associated with the collaboration agent 10 B from a database configured to store public profiles of users (step 112 ).
- The collaboration agent 10 A can obtain an address of the collaboration agent 10 B by, for example, using the identity of the user 16 B in a directory lookup, as is known by those skilled in the art.
- The collaboration agent 10 A issues a QueryAvailability request to the collaboration agent 10 B (step 114 ).
- The collaboration agent 10 A receives a response from the collaboration agent 10 B that includes one or more of presence information 30 B, location information 32 B, or calendar information 34 B (step 116 ). If, from the provided information, the collaboration agent 10 A determines that the user 16 B is available for communication, the collaboration agent 10 A can issue an AuthorizeCommunication request to the collaboration agent 10 B (step 118 ). The collaboration agent 10 B may review one or more rules associated with the user 16 A and, for example, contact the user 16 B to determine whether the user 16 B desires to communicate with the user 16 A. Assume that the collaboration agent 10 B has determined that the user 16 B desires to speak to the user 16 A.
- The collaboration agent 10 B sends a response indicating an approval to talk with the user 16 A and indicating that a voice channel is a desired communications mechanism (step 120 ).
- The collaboration agents 10 A, 10 B then initiate a communications session 36 C between the conference bridge 20 A and the conference bridge 20 B, enabling the user 16 A to communicate with the user 16 B via the conference bridges 20 A, 20 B and the communications sessions 36 A, 36 B, 36 C (step 122 ).
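- The exchange in steps 114 through 122 can be read as a small request/response protocol between the two agents. The message names QueryAvailability and AuthorizeCommunication come from the text; the dictionary payloads and handler signatures below are an assumed encoding, not a defined wire format.

```python
def handle_query_availability(context_b):
    """Collaboration agent 10 B answers a QueryAvailability request with the
    contextual information 18 B it is willing to share: presence 30 B,
    location 32 B, and the next free slot taken from calendar 34 B."""
    return {"type": "AvailabilityResponse",
            "presence": context_b["presence"],
            "location": context_b["location"],
            "next_free": context_b["next_free"]}

def next_request(availability):
    """Collaboration agent 10 A decides what to send next, given the response."""
    if availability["presence"] == "available":
        # Ask 10 B to authorize a voice path between the two conference bridges.
        return {"type": "AuthorizeCommunication", "media": "voice"}
    # Otherwise fall back to negotiating a later time (see the FIG. 5 discussion).
    return {"type": "FindFreeMeeting", "earliest": availability["next_free"]}

context_b = {"presence": "available", "location": "in the office", "next_free": "15:00"}
print(next_request(handle_query_availability(context_b)))   # -> AuthorizeCommunication, voice
```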
- FIG. 5 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to another embodiment of the present invention.
- Assume that the user 16 A desires to communicate with the user 16 B, but that the user 16 B, unbeknownst to the user 16 A, is traveling.
- Steps 200 through 208 may be identical to steps 100 through 116 discussed with reference to FIG. 4 , and will not be repeated herein.
- Assume that the response from the collaboration agent 10 B indicates that the user 16 B is out of the office and traveling, but that a DoNotDisturb flag is reset, indicating that the user 16 B may be contacted, if possible (step 210 ).
- The collaboration agent 10 A issues an AuthorizeCommunication request to the collaboration agent 10 B (step 212 ).
- The collaboration agent 10 B, based on the contextual information 18 B, determines that the user 16 B is out of the office and traveling, and attempts to contact the user 16 B via one or more user devices 22 B associated with the user 16 B (step 214 ). Notably, the collaboration agent 10 B attempts to contact the user 16 B without providing personal information, such as cell phone numbers associated with the user 16 B, to the collaboration agent 10 A.
- Assume that the collaboration agent 10 B is able to establish a communications session 36 B between the conference bridge 20 B and a user device 22 B associated with the user 16 B.
- The collaboration agent 10 B then informs the collaboration agent 10 A to establish a communications session 36 C between the conference bridge 20 A and the conference bridge 20 B, enabling the user 16 A to converse with the user 16 B (step 216 ).
- If the user 16 B cannot be reached, the collaboration agent 10 B can instead inform the collaboration agent 10 A that the user 16 B is currently unavailable (step 218 ).
- In that event, the collaboration agents 10 A, 10 B could exchange one or more requests, such as FindFreeMeeting, ReserveMeeting, NotifyWhenAvailable, or LeaveVoiceMessage requests.
- The FindFreeMeeting and ReserveMeeting requests can be used by the collaboration agents 10 A, 10 B to obtain calendar information and negotiate a time for a meeting when both users 16 A, 16 B are available, and to reserve the negotiated time on the respective calendars associated with the users 16 A, 16 B.
- The NotifyWhenAvailable request can be used to indicate to the collaboration agent 10 B to notify the collaboration agent 10 A when the contextual information 18 B associated with the user 16 B indicates that the user 16 B is present and available.
- At that point, the collaboration agent 10 A can inform the user 16 A that the user 16 B appears to be present and available, enabling the user 16 A to again attempt to contact the user 16 B.
- The LeaveVoiceMessage request can be used to enable the user 16 A to leave a voice mail message for the user 16 B.
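- A FindFreeMeeting exchange ultimately reduces to intersecting the free time slots on the two calendars; ReserveMeeting would then book the chosen slot on both. The helper below shows one plausible way to compute the slot; the half-hour granularity and the (start, end) busy-interval representation are assumptions made for the example.

```python
from datetime import datetime, timedelta

def free_slots(busy, day_start, day_end, step=timedelta(minutes=30)):
    """Yield slot start times in [day_start, day_end) that avoid every busy (start, end) interval."""
    t = day_start
    while t + step <= day_end:
        if all(not (start < t + step and t < end) for start, end in busy):
            yield t
        t += step

def find_free_meeting(busy_a, busy_b, day_start, day_end):
    """First slot free on both calendars, or None if there is no common slot."""
    common = set(free_slots(busy_a, day_start, day_end)) & set(free_slots(busy_b, day_start, day_end))
    return min(common) if common else None

day = datetime(2024, 1, 8)
busy_a = [(day.replace(hour=9), day.replace(hour=11))]    # user 16 A: 9:00-11:00 meeting
busy_b = [(day.replace(hour=10), day.replace(hour=12))]   # user 16 B: 10:00-12:00 meeting
print(find_free_meeting(busy_a, busy_b, day.replace(hour=9), day.replace(hour=17)))  # -> 12:00
```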
- FIG. 6 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the present invention.
- Assume that the collaboration agent 10 A determines that the user 16 A has an imminent meeting, based on contextual information 18 A obtained from the calendar information 34 A (step 300 ).
- Assume further that the collaboration agent 10 A determines that the user 16 A is in the midst of a communication with another user 16 .
- The collaboration agent 10 A whispers a meeting reminder to the user 16 A via the first communications session 36 A and the user device 22 A, indicating that the user 16 A has an imminent meeting (step 302 ).
- The collaboration agent 10 A then receives a request from the collaboration agent 10 B to establish a communications session between the conference bridge 20 A and the conference bridge 20 B (step 304 ).
- The collaboration agent 10 A obtains meeting information from the calendar information 34 A (step 306 ).
- The collaboration agent 10 A determines that the user 16 B is an attendee of the meeting that is imminent (step 308 ).
- The collaboration agent 10 A establishes a communications session between the conference bridge 20 A and the conference bridge 20 B, enabling the user 16 B to participate in the meeting at the time of the meeting (step 310 ).
- The collaboration agent 10 A also whispers to the user 16 A via the first communications session 36 A that the user 16 B has joined the conference bridge 20 A (step 312 ).
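- The decision in steps 304 through 310 — accept the incoming bridge-to-bridge request without consulting the user, because the caller is an attendee of a meeting that is about to start — could be expressed roughly as follows. The ten-minute window is an assumption; the text only requires the meeting to be imminent or scheduled substantially near the time of the request.

```python
from datetime import datetime, timedelta

def should_auto_join(caller, now, meetings, window=timedelta(minutes=10)):
    """Accept a bridge-to-bridge request automatically when the caller is an
    attendee of a meeting scheduled at, or substantially near, the current time."""
    for meeting in meetings:
        starts_soon_or_running = meeting["start"] - window <= now < meeting["end"]
        if starts_soon_or_running and caller in meeting["attendees"]:
            return True
    return False

now = datetime(2024, 1, 8, 9, 57)
meetings = [{"start": datetime(2024, 1, 8, 10, 0),
             "end": datetime(2024, 1, 8, 11, 0),
             "attendees": ["user-16B", "user-16C"]}]
print(should_auto_join("user-16B", now, meetings))   # -> True: join 16 B and whisper to 16 A
```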
- FIG. 7 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the present invention.
- Assume that the collaboration agent 10 A receives a request from the collaboration agent 10 B to establish a communications session between the conference bridge 20 A and the conference bridge 20 B to enable the user 16 B to converse with the user 16 A (step 400 ).
- The collaboration agent 10 A determines, based on contextual information 18 A, that the user 16 A is available for communication with the user 16 B (step 402 ). However, the collaboration agent 10 A determines that a rule exists that requires actual confirmation from the user 16 A prior to establishing a communications session to enable conversations with the user 16 B.
- Rules can be maintained in a storage device or memory accessible to the respective collaboration agent 10 . Rules can be set up by a respective user 16 based on a variety of criteria, such as the calling party, the time of day, and the like.
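- Rules of this kind can be thought of as an ordered list of predicates over the incoming request and the user's current context, each mapped to an action. The predicate-plus-action representation below is an illustrative assumption, not the patent's data model.

```python
from datetime import time

def evaluate_rules(rules, request, context, default="take_message"):
    """Return the action of the first matching rule, or a default action."""
    for predicate, action in rules:
        if predicate(request, context):
            return action
    return default

rules = [
    # The manager always gets whispered through if the user is present at a device.
    (lambda r, c: r["caller"] == "user-16B" and c["at_device"], "whisper_to_user"),
    # Outside business hours, go straight to voice mail.
    (lambda r, c: not time(8) <= r["time"] <= time(18), "leave_voice_message"),
    # Anyone else must be confirmed by the user before a session is bridged.
    (lambda r, c: c["available"], "ask_user_to_confirm"),
]

request = {"caller": "user-16C", "time": time(14, 15)}
context = {"at_device": True, "available": True}
print(evaluate_rules(rules, request, context))   # -> "ask_user_to_confirm"
```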
- The collaboration agent 10 A identifies the user 16 B to the user 16 A via the communications session 36 A (step 404 ).
- The user 16 A initiates a command to the collaboration agent 10 A requesting additional information, such as the nature of the call, from the user 16 B (step 406 ).
- The collaboration agent 10 A establishes a communications session 36 C between the conference bridge 20 A and the conference bridge 20 B (step 408 ) and provides the question from the user 16 A to the user 16 B via the communications session 36 C (step 410 ).
- The user 16 B responds with a reply to the request, and the collaboration agent 10 A provides the audio signals from the user 16 B to the first communications session 36 A so that the user 16 A can hear the response from the user 16 B (step 412 ).
- The user 16 A indicates approval to the collaboration agent 10 A for a conversation with the user 16 B (step 414 ).
- The collaboration agent 10 A enables communications between the communications session 36 A and the communications session 36 C to enable the user 16 A to converse with the user 16 B (step 416 ).
- According to another embodiment, the collaboration agent 10 A enables the user 16 A to monitor a plurality of conference calls simultaneously.
- Assume that the user 16 A is attending a first conference call via the conference bridge 20 A.
- The first conference call is scheduled from 9:00 am to 11:00 am.
- The user 16 A also has a second conference call that he is expected to attend that is scheduled from 10:00 am to 11:00 am.
- At 10:00 am, the collaboration agent 10 A initiates a communications session with the conference bridge 20 that is hosting the second conference call.
- The collaboration agent 10 A informs the user 16 A that he has been joined to the second conference call, mutes the outgoing audio associated with the first conference call so that attendees of the first conference call cannot hear the user 16 A, and enables bi-directional audio on the second conference call so that the user 16 A can indicate his presence to the attendees of the second conference call.
- The user 16 A can then issue a command to the collaboration agent 10 A to monitor the second conference call, and the collaboration agent 10 A can enable bi-directional communications again for the first conference call and inhibit outgoing audio for the second conference call so that attendees of the second conference call cannot hear the user 16 A or the first conference call.
- The user 16 A can now participate fully in the first conference call, while simultaneously monitoring the second conference call.
- Assume that the user 16 A then determines that he must respond to a remark made on the second conference call.
- The user 16 A issues a command to the collaboration agent 10 A, and in response, the collaboration agent 10 A inhibits the outgoing audio associated with the first conference call and enables bi-directional audio with the second conference call.
- The user 16 A speaks to the attendees of the second conference call, and the attendees of the first conference call cannot hear the user 16 A, although the user 16 A can continue to hear the audio associated with the first conference call while he fully participates in the second conference call.
- The collaboration agent 10 A can issue a prerecorded message to either conference call as the user 16 A moves from active to passive participation. For example, when the user 16 A indicates he desires to speak to the attendees of the second conference call, the collaboration agent 10 A may play a message to the attendees of the first conference call that indicates that the user 16 A is departing the conference call temporarily. When the user 16 A returns to full participation in the first conference call, the collaboration agent 10 A can issue another prerecorded message to the attendees of the first conference call that the user 16 A has returned to the conference call.
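- The active/monitor switching described above is essentially a small state machine over two bridge legs: exactly one call receives the user's outgoing audio at a time, while the user continues to hear both. A hypothetical sketch, with the announcement behaviour folded in:

```python
class CallMonitor:
    """Track which of several conference calls receives the user's outgoing audio."""

    def __init__(self, calls):
        self.calls = list(calls)
        self.active = self.calls[0]          # the call the user is currently speaking into

    def speak_to(self, call, announce=print):
        """Make `call` active: unmute it, mute the others, and announce the change."""
        if call not in self.calls:
            raise ValueError(f"not joined to {call}")
        if call != self.active:
            announce(f"{self.active}: the user is stepping away temporarily")
            self.active = call
            announce(f"{call}: the user has joined the discussion")

    def outgoing_audio(self):
        """Which calls can hear the user; the user still hears all joined calls."""
        return {call: (call == self.active) for call in self.calls}

monitor = CallMonitor(["first call (9:00-11:00)", "second call (10:00-11:00)"])
monitor.speak_to("second call (10:00-11:00)")   # respond to a remark on the second call
print(monitor.outgoing_audio())
monitor.speak_to("first call (9:00-11:00)")     # return to full participation in the first call
print(monitor.outgoing_audio())
```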
Description
- The present application is a continuation of co-pending U.S. patent application Ser. No. 12/344,914, filed Dec. 29, 2008, entitled “COLLABORATION AGENT.”
- This invention relates generally to communications, and in particular to a collaboration agent for managing real-time communications on behalf of a user.
- Managing real-time communications, such as telephone calls, text messages, and instant messaging (IM) messages, can be time-consuming and frustrating at times. In a business environment, individuals frequently feel compelled to respond to real-time communications as quickly as possible, but may be unable to do so because the individuals are currently engaged in communicating with someone else. For example, an individual may be using a work phone to participate in a conference call when the individual is alerted that a call is coming in on a second line of the work phone. Other than observe the calling party's phone number, there is little the individual can do. The calling party may then attempt to call the individual's cell phone, and when that is similarly unsuccessful in reaching the individual, attempt to send the individual a text message or an instant message. The individual may be aware of each of these attempts by the calling party to reach the individual, but be unable to communicate with the calling party without disrupting the conference call. When the conference call is over, the individual may have to contact multiple voice messaging systems to determine whether or not the calling party left a message and the purpose of the call. The individual may only later learn that the calling party had an urgent need to contact the individual for a reason that, had the individual been aware of it, would have caused the individual to terminate participation in the conference call to accept the call from the calling party.
- Business environments frequently have one or more sources of information that are useful in determining an individual's availability, presence, or location. For example, most business environments enable their employees to use an electronic calendar to schedule and keep track of meetings. For a particular meeting, the electronic calendar may include information such as a time and date of the meeting, the attendees of the meeting, and a physical location or dial-in information associated with the meeting. Business environments may also enable their employees to communicate with each other via instant messaging, which may include features such as presence indicators which can be used to indicate availability and willingness-to-communicate states of the respective employee. While aspects of these information sources may be visible to other users of the same application, they are not typically used in a real-time application that manages voice communications and other real-time communications. However, such information may be very valuable in establishing and facilitating communications between individuals. For example, in the example provided previously, had the calling party reached a service that could access the individual's electronic calendar, the calling party may have been able to determine another time they would be likely to reach the individual. Alternately, had the calling party reached a service that could indicate the individual was present in their office based on presence information associated with the individual, the calling party may have been able to simply walk to the individual's office. In general, it would be extremely helpful in today's business environment if an intelligent agent could help manage an individual's real-time communications based on information maintained about the individual in one or more information sources.
- The present invention relates to a collaboration agent that manages real-time communications on behalf of a user. The collaboration agent includes a bridge, sometimes referred to as a conference bridge, and a contextual information interface adapted to obtain information about the user, such as availability, location, or presence. The collaboration agent anchors calls associated with the user to the conference bridge in a network rather than a user device. The user can interface with the collaboration agent via one or more end-user devices. The collaboration agent can interface with other collaboration agents to facilitate communications between associated users. For example, the collaboration agent can receive requests from other collaboration agents for available times during which the associated user is available for a conversation.
- According to one embodiment of the invention, the collaboration agent includes a speech recognition processor for receiving commands from the user. The collaboration agent also includes an ability to communicate with the user, or other users, via text-to-speech processing, speech generation, or playing recorded messages. The collaboration agent establishes, communications sessions between the conference bridge and a user device associated with the user, and other conference bridges associated with other users. The collaboration agent can mix multiple communications sessions associated with the conference bridge together, or can keep certain communications sessions separate from other communications sessions. The collaboration agent can deliver an audio signal to a user from multiple sources while preventing each source from hearing the audio associated with the other source.
- According to one embodiment of the invention, the collaboration agent obtains meeting information from a calendar associated with the user via the contextual information interface. The collaboration agent can use the meeting information to remind the user of imminent or existing meetings of which the user is an attendee. The collaboration agent can also automatically connect the user to another conference bridge associated with another user based on the meeting information, and indicate to the user they have been joined to the other conference bridge, without direction from the user.
- According to another embodiment of the invention, the collaboration agent can receive a request from a second user to establish a communications session between the conference bridge of the user and the conference bridge of the second user. The collaboration agent can determine from meeting information obtained via the contextual information interface that the user has a meeting scheduled at, or substantially near, the time of the request, and that the second user is an attendee of the meeting. The collaboration agent can automatically establish the communications session between the conference bridge of the first user and the conference bridge of the second user based on the meeting information and the identity of the second user.
- According to yet another embodiment of the invention, the collaboration agent can receive a request to establish a first communications session between the conference bridge of the user and a second conference bridge of a second user for the purposes of enabling the second user to communicate with the first user. The collaboration agent determines that the first user is not present and is travelling on business. The collaboration agent successfully establishes a second communications session between the conference bridges of the users and indicates that the second user desires to speak with the user. Upon receiving an indication from the user, the collaboration agent enables communications between the first communications session and the second communications session, while maintaining the directory number of the user confidential.
- Those skilled in the art will appreciate the scope of the present invention and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
- The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the invention, and together with the description serve to explain the principles of the invention.
-
FIG. 1 is a block diagram illustrating a plurality of collaboration agents according to one embodiment of the invention. -
FIG. 2 is a block diagram illustrating a collaboration agent shown inFIG. 1 in greater detail. -
FIG. 3 is a block diagram illustrating two collaboration agents managing communications on behalf of respective users. -
FIG. 4 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to one embodiment of the invention. -
FIG. 5 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to another embodiment of the invention. -
FIG. 6 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the invention. -
FIG. 7 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the invention. - The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
- The present invention enables real-time and near real-time communications, such as telephone calls, instant messaging (IM) messages, and text messages (e.g., Short Message Service (SMS) messages), to be managed on a user's behalf by an intelligent controller, referred to herein as a collaboration agent, which preferably operates in conjunction with other collaboration agents to simplify, manage, and facilitate communications between respective users. While the invention described herein has primary applicability in a business environment and will be discussed in the context of a business environment, those skilled in the art will recognize applicability of the invention in other environments, and the invention is not limited to use in a business environment.
- FIG. 1 is a block diagram illustrating a plurality of collaboration agents according to one embodiment of the invention. A plurality of collaboration agents 10 are coupled to a network 12 via a communications link 14. The network 12 can comprise any combination of wired or wireless technologies, and can utilize any data transport technologies that enable communications between the collaboration agents 10, such as Transmission Control Protocol/Internet Protocol (TCP/IP) or Sequenced Packet Exchange/Internetwork Packet Exchange (SPX/IPX). The network 12 can, for example, comprise the Internet or can comprise a private enterprise network. The network 12 can also comprise a combination of networks, such as multiple private enterprise networks, that are coupled together to enable communications between multiple collaboration agents 10 associated with different enterprises. The communications links 14 can comprise any suitable access communications links, such as wired or wireless access links including Ethernet, broadband cable or Digital Subscriber Line (DSL) lines, WiFi, and the like. Each collaboration agent 10 has an associated user 16, and the collaboration agents 10 manage communications on behalf of the respective user 16. The collaboration agents 10 use associated contextual information 18 to make decisions on behalf of the respective user 16, or to locate the respective user 16, as described in greater detail herein. The collaboration agents 10 can communicate with one another via messaging and predefined requests using a standard or conventional messaging protocol. Notably, each collaboration agent 10 includes a conference bridge 20 for anchoring calls associated with the user 16. Each collaboration agent 10 can communicate with one or more user devices 22, such as wired, cordless or cellular telephones, Personal Digital Assistants (PDAs), computers, and the like, to provide communications with the user 16.
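- For illustration only, the relationships of FIG. 1 can be summarized in a small data model: each collaboration agent 10 owns a conference bridge 20, a list of user devices 22, and a handle to the contextual information 18 of its user 16. The Python sketch below is not part of the disclosure; every class and field name is an assumption chosen for readability.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical data model mirroring FIG. 1; names are illustrative, not from the patent.
@dataclass
class UserDevice:
    device_id: str          # e.g. "desk-phone", "cell-phone"
    address: str            # dialable number or network address

@dataclass
class ConferenceBridge:
    bridge_number: str                                   # number used to anchor calls
    sessions: List[str] = field(default_factory=list)    # active communications sessions

@dataclass
class CollaborationAgent:
    user_id: str
    bridge: ConferenceBridge
    devices: List[UserDevice] = field(default_factory=list)
    contextual_info: Dict[str, str] = field(default_factory=dict)  # presence, location, calendar

# Two agents reachable over the network, each managing one user's communications.
agent_a = CollaborationAgent("user16A", ConferenceBridge("+1-555-0100"),
                             [UserDevice("desk-phone", "+1-555-0101")])
agent_b = CollaborationAgent("user16B", ConferenceBridge("+1-555-0200"),
                             [UserDevice("cell-phone", "+1-555-0201")])
print(agent_a, agent_b, sep="\n")
```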
- FIG. 2 is a block diagram illustrating a collaboration agent shown in FIG. 1 in greater detail. The collaboration agent 10 preferably includes a speech processing interface 24 adapted to interface to speech processing technology that enables the collaboration agent 10 to receive instructions from a respective user 16. The speech processing technology can be integral with the collaboration agent 10, or can be a stand-alone speech recognition processor. If stand-alone, the speech recognition processor 24 can devote a speech recognition channel to each collaboration agent 10 upon initiation of the collaboration agent 10, or can provide a speech recognition channel on an on-demand basis. For one example of using speech recognition processing in an on-demand application, please see U.S. patent application Ser. No. 12/341,246, entitled METHOD AND SYSTEM FOR DETECTING A RELEVANT UTTERANCE, filed Dec. 22, 2008, which is hereby incorporated herein in its entirety. The speech processing interface, according to one embodiment of the invention, enables a user 16 to communicate with the collaboration agent 10 on a hands-free basis and in an intuitive and natural manner. The collaboration agent 10 also preferably has text-to-speech capabilities and the ability to play prerecorded messages or generated messages on behalf of the user 16 to another user 16, or to provide instructions or help to the respective user 16.
- The collaboration agent 10 includes a control system 26 that includes a conventional or proprietary operating system capable of executing one or more programs loaded into a memory 28 that contain instructions suitable for carrying out the functionality described herein. According to one embodiment of the invention, the collaboration agent 10 comprises a media application server. The collaboration agent 10 interfaces to contextual information 18 that can include, for example, presence information 30, location information 32, or calendar information 34. The presence information 30 can comprise any suitable information that indicates a presence state of the user 16, or that implies a presence state of the user 16. For example, many IM applications enable a user to indicate a presence state such as “available,” “busy,” “out to lunch,” and the like. The collaboration agent 10 can communicate with respective IM applications to obtain such presence information via any suitable protocols, such as Session Initiation Protocol (SIP), Extensible Messaging and Presence Protocol (XMPP), and the like. Alternately, the collaboration agent 10 may be able to infer a presence through other information. For example, when the user device 22 is off-hook, the collaboration agent 10 may infer a presence state of “busy” for the user 16.
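- The presence-handling behavior described above lends itself to a simple precedence rule: prefer an explicitly published IM presence state, and fall back to a state inferred from device activity. The following minimal sketch assumes that behavior; the function name and return values are illustrative only.

```python
from typing import Optional

# Illustrative presence inference: an explicit IM presence state (e.g. obtained via
# SIP or XMPP) wins; otherwise presence is inferred from device state, e.g. an
# off-hook user device 22 implies "busy". Names are assumptions, not from the patent.
def infer_presence(im_presence: Optional[str], device_off_hook: bool) -> str:
    if im_presence:
        return im_presence          # explicit state published by an IM application
    if device_off_hook:
        return "busy"               # inferred: the user's device is in use
    return "unknown"

print(infer_presence(None, device_off_hook=True))    # -> busy
print(infer_presence("out to lunch", False))         # -> out to lunch
```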
- The location information 32 can include any information that identifies, or implies, a location of the user 16. For example, the collaboration agent 10 may interface with a Global Positioning System (GPS) device associated with the user 16, or be able to determine a location through cellular telephone triangulation techniques known to those skilled in the art. Alternately, some enterprises provide employees Radio Frequency Identification (RFID) tags that monitor the physical location of an employee and provide that information to a central database, or monitor devices used by the user 16 in a similar manner. Alternately, the user 16 may set a location to indicate that they are “in the office,” “at home,” “in the car,” and the like.
- The calendar information 34 comprises information associated with a respective electronic calendar, or other activity tracking mechanism, of the user 16. Electronic calendars are widely used today to reserve meetings with other users 16. Electronic calendars, such as Microsoft Outlook, enable users 16 with sufficient privileges to exchange meeting requests and reserve time slots on the electronic calendars of other users 16 upon demand. Meeting requests typically include information such as the attendees of the meeting, location or dial-in information associated with the meeting, starting time and expected ending time of the meeting, and the like. Calendar information 34 can be obtained by the collaboration agent 10 using an appropriate application programming interface provided by the respective calendar provider, such as a Microsoft Outlook Application Programming Interface (MAPI), Google Calendar API, or a standard iCal calendar interface. While the presence information 30, the location information 32, and the calendar information 34 are shown separately in FIG. 2 for purposes of illustration, each of the presence information 30, the location information 32, and the calendar information 34 constitutes contextual information 18, and similar contextual information 18 may come from one or more of the presence information 30, the location information 32, and the calendar information 34. For example, the collaboration agent 10 may infer presence and availability by examining the calendar information 34 to determine whether the user 16 is involved in a meeting.
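- A hedged sketch of how the calendar information 34 could be reduced to an availability decision is shown below. It represents meetings as plain start/end intervals rather than calling a real calendar API, so the field names are assumptions and no MAPI, Google Calendar, or iCal binding is implied.

```python
from datetime import datetime

# Minimal sketch of deriving availability from calendar entries; the dictionary
# fields ("start", "end", "attendees") are assumptions made for illustration.
def in_meeting(meetings, now):
    """Return the meeting the user is currently in, or None if the user is free."""
    for m in meetings:
        if m["start"] <= now < m["end"]:
            return m
    return None

now = datetime(2009, 1, 5, 14, 30)
meetings = [{"start": datetime(2009, 1, 5, 14, 0),
             "end":   datetime(2009, 1, 5, 15, 0),
             "attendees": ["user16A", "user16B"]}]

current = in_meeting(meetings, now)
print("busy until", current["end"]) if current else print("available now")
```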
- While calendar, presence, and location information are provided as examples of contextual information used in the present invention, those skilled in the art will appreciate that the invention is not limited thereto. Any type of contextual information that may be accessible to the collaboration agent 10 for determining availability of a respective user 16 may be used. For example, the collaboration agent 10 may use profile information associated with a cellular telephone of the user 16 that is accessible by the collaboration agent 10, or electronic records identifying the user 16 as being “on” or “off” duty.
- The collaboration agent 10 interfaces with and controls the conference bridge 20 via one or more suitable protocols such as Call Control eXtensible Markup Language (CCXML), Media Resource Control Protocol (MRCP), Media Server Control (mediactrl), and the like. The collaboration agent 10 uses the conference bridge 20 as an anchor point for communications sessions associated with the user 16. The conference bridge 20 includes a mixer, and enables certain communications sessions joined to the conference bridge 20 to be coupled with other communications sessions joined to the conference bridge 20 as desired. For example, a first communications session may exist between the user device 22 and the conference bridge 20, and a second communications session may exist between the conference bridge 20 and another conference bridge 20 associated with another user 16. The first and second communications sessions are joined together to enable the users 16 to converse with one another. The collaboration agent 10 may determine that a meeting obtained from the calendar information 34 begins in five minutes. The collaboration agent 10 can use the first communications session to inform the user 16 of the imminent meeting without enabling the audio signals to be provided to the second communications session, so that the other user 16 is unaware that the first user 16 received an audible reminder of an imminent meeting.
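- The mixing behavior described above can be pictured as a per-listener table of audible sources: a full-duplex join adds each session to the other's table, while a whisper adds the announcement only to the intended listener. The sketch below is a simplified model of that idea, not an implementation of CCXML, MRCP, or mediactrl; all names are assumptions.

```python
# Simplified bridge/mixer model: each listener has a set of sources it is allowed
# to hear, so a whisper (e.g. a meeting reminder) reaches only session 36A.
class Bridge:
    def __init__(self):
        self.hearable = {}                       # listener -> set of audible sources

    def join(self, a, b):                        # full-duplex join of two sessions
        self.hearable.setdefault(a, set()).add(b)
        self.hearable.setdefault(b, set()).add(a)

    def whisper(self, source, listener):         # one-way audio to a single listener
        self.hearable.setdefault(listener, set()).add(source)

bridge = Bridge()
bridge.join("session36A", "session36C")          # user 16A talking to the far end
bridge.whisper("agent-announcement", "session36A")
print(bridge.hearable["session36A"])             # hears the far end and the whisper
print(bridge.hearable["session36C"])             # hears only user 16A
```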
- Another example of the mixing features of the conference bridge 20 includes the ability for certain users 16 to hear the audio signals of some users 16 but not other users 16. For example, assume that a first user 16 has organized a first conference call with a first group of users 16 on the conference bridge 20 for 9:00 am-10:00 am, and a second conference call with a second group of users 16 on the conference bridge 20 for 10:00 am-11:00 am. At 9:55 am, before the first conference call has concluded, a third user 16 from the second group of users 16 is joined to the conference bridge 20 for the second conference call beginning at 10:00 am. However, because the third user 16 is not an attendee of the first conference call, the conference bridge 20 does not provide the audio signals from the communications sessions associated with the first group of users 16 to the third user 16 awaiting the second conference call. The conference bridge 20 may announce to the first user 16 associated with the conference bridge 20 that the third user 16 has joined the conference bridge 20 and awaits the 10:00 am call. The conference bridge 20 can enable the first user 16 to welcome the third user 16 and advise the third user 16 that the call will be beginning shortly, while simultaneously preventing the second group of users 16 from hearing the discussion between the first user 16 and the third user 16.
- Anchoring calls to the conference bridge 20 provides several notable features, as will be discussed herein. For example, if the user 16 determines that they need to switch from a work phone to a cellular phone in the middle of a conversation, the individual or individuals with whom the user 16 is currently speaking via communications sessions joined to the conference bridge 20 need not redial or be inconvenienced. The user 16 only needs to establish a new communications session between the cell phone and the conference bridge 20. The new communications session can be initiated by the user 16 dialing a telephone number associated with the conference bridge 20, or the collaboration agent 10, or by asking the collaboration agent 10 to initiate a call to the cell phone associated with the user 16.
- FIG. 3 is a block diagram illustrating two collaboration agents managing communications on behalf of respective users. FIG. 3 will be used herein to illustrate several ways in which the collaboration agents 10 can manage and facilitate real-time communications on behalf of a respective user 16. FIG. 3 will also be used in conjunction with FIGS. 4-7 to illustrate particular embodiments disclosed therein. While FIG. 3 illustrates a collaboration agent 10A and a collaboration agent 10B, the collaboration agents 10A, 10B may be referred to herein generally as collaboration agents 10 where the discussion does not relate to a particular collaboration agent 10A or collaboration agent 10B. Each collaboration agent 10 is preferably a routing point for all real-time communications destined for a respective user 16. This can be managed, for example, by forwarding telephone devices to a telephone number associated with the respective collaboration agent 10 and having IM messages forwarded from the respective IM applications to an address associated with the respective collaboration agent 10 via a network feature or other mechanisms known to those skilled in the art. Such configuration can be handled by a user 16 or by an administrator when setting up a collaboration agent 10 for use by a particular user 16.
- The collaboration agent 10 receives textual communications, such as IM and text messages, and can provide them to the user 16 based on the contextual information 18 associated with the user 16. For example, assume that the user 16 is driving home from the office and the collaboration agent 10 receives an IM message for the user 16. The collaboration agent 10 can obtain contextual information 18 and determine that the user 16 is no longer in the office. The collaboration agent 10 can use the conference bridge 20 to attempt to contact the user 16 via the user's 16 cell phone. Assuming that the user 16 answers the cell phone, the collaboration agent 10 can provide a prerecorded message to the user 16 saying “An IM message has arrived.” The collaboration agent 10 couples the communications session between the conference bridge 20 and the user device 22 to the speech processing interface 24. The user 16 may say “read the message.” The speech processing interface 24 detects the command “read” and provides this information to the collaboration agent 10. The collaboration agent 10 uses text-to-speech processing to read the IM message to the user 16. The user 16 may reply “Send IM. Thanks for the message, I agree.” The collaboration agent 10, via the speech processing interface 24, recognizes the command to create and send an IM message, and uses speech-to-text processing to convert the speech “Thanks for the message, I agree” to a textual format. The collaboration agent 10 then responds to the IM message with an IM message saying “Thanks for the message, I agree.”
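- The spoken interaction in the example above amounts to recognizing a small command vocabulary and routing the remainder of the utterance to text-to-speech or speech-to-text. A minimal sketch of that dispatch logic follows, with the speech engines stubbed out as plain strings; the prompts and command words are assumptions.

```python
# Illustrative command dispatch for the spoken IM interaction described above.
# A real system would sit behind the speech processing interface 24 and a TTS
# engine; here both are represented by prefixed strings. Names are assumptions.
def handle_spoken_command(utterance: str, pending_im: str) -> str:
    words = utterance.lower().split()
    if words[:1] == ["read"]:
        return f"TTS> {pending_im}"                    # read the pending IM aloud
    if words[:2] == ["send", "im"]:
        reply = utterance.split(" ", 2)[2]             # text following "Send IM"
        return f"IM-REPLY> {reply}"                    # speech-to-text reply to sender
    return "PROMPT> Say 'read the message' or 'Send IM <text>'."

print(handle_spoken_command("read the message", "Can you review the draft?"))
print(handle_spoken_command("Send IM Thanks for the message, I agree.", ""))
```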
- The collaboration agent 10A can communicate with the collaboration agent 10B on behalf of the user 16A as appropriate. For example, the collaboration agent 10B may initiate a request to open a voice communications session between the conference bridge 20A and the conference bridge 20B to enable the user 16B to converse with the user 16A. However, the collaboration agent 10A may determine that the user 16A is currently on the phone, and may determine from the calendar information 34A that the user 16A is engaged in a meeting until 3:00 PM, but is available at 3:00 PM for a conversation with the user 16B. The collaboration agent 10A can send a message to the user 16B that the user 16A is not available, but that the user 16A can speak with the user 16B at 3:00 PM. The collaboration agent 10B can determine if the user 16B is available at 3:00 PM and, if so, can confirm a meeting with the collaboration agent 10A. Each collaboration agent 10 may have rules, or filters, that can affect how the collaboration agent 10 manages or facilitates a real-time communication based on a particular criterion. For example, assume in the previous example the user 16B is the manager of the user 16A. The user 16A may set up a rule stating that any attempts from the user 16B to contact the user 16A are to be communicated to the user 16A if the user 16A is present at a user device 22. In this example, the collaboration agent 10A may then “whisper” or communicate to the user 16A via the conference bridge 20A that the user 16B is attempting to contact the user 16A. The whisper cannot be heard by the other participants of the call in which the user 16A is currently engaged. The user 16A may determine that his presence is not needed on the call, terminate his participation in the call, and indicate to the collaboration agent 10A to establish the communications session between the conference bridge 20A and the conference bridge 20B to enable the user 16B to communicate with the user 16A.
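- The rule or filter concept can be modeled as a list of match criteria mapped to an action, evaluated when a contact attempt arrives. The sketch below assumes a single manager rule like the one in the example; the rule fields and action names are illustrative and not taken from the disclosure.

```python
# Illustrative rule evaluation: a rule matches on the calling party and on whether
# the called user is present, and yields an action such as "whisper". Assumptions only.
RULES = [
    {"caller": "user16B", "requires_presence": True, "action": "whisper"},
]

def apply_rules(caller: str, callee_present: bool) -> str:
    for rule in RULES:
        if rule["caller"] == caller and (callee_present or not rule["requires_presence"]):
            return rule["action"]
    return "default-handling"

print(apply_rules("user16B", callee_present=True))    # -> whisper (manager rule matched)
print(apply_rules("user16C", callee_present=True))    # -> default-handling
```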
- The collaboration agent 10 is preferably always executing on behalf of the user 16. Thus, the collaboration agent 10, even during non-work hours, can make decisions on behalf of the user 16 to either arrange meetings in the future, or attempt to contact the user 16 based on criteria associated with a respective real-time communication.
- FIG. 4 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to one embodiment of the invention. FIG. 4 will be discussed in conjunction with FIG. 3. Assume that the user 16A wishes to communicate with the user 16B. The user 16A utters a command or a hot word into the user device 22A, which is coupled to the conference bridge 20A via a first communications session 36A. Preferably, according to one embodiment of the present invention, the user 16A remains coupled to the conference bridge 20A via the first communications session 36A during normal business hours so that the collaboration agent 10A can communicate to the user 16A via the user device 22A based on the receipt of real-time communications. For example, when the user 16A arrives at the office each morning, he uses a work telephone to dial into a telephone number associated with a respective collaboration agent 10A to establish a communications session between the work telephone and the collaboration agent 10A. The user 16A leaves the work telephone connected to the collaboration agent 10A throughout the day, and enables a speaker phone capability and hands-free speaking capability of the work telephone, so the user 16A can communicate with the collaboration agent 10A by merely speaking in proximity of the telephone, and similarly can hear information provided by the collaboration agent 10A when in proximity of the work telephone.
- The collaboration agent 10A recognizes a command from the user 16A and prompts the user 16A for a particular command. The user 16A indicates a desire to contact the user 16B by identifying the user 16B by name (step 100). The collaboration agent 10A searches a contacts list associated with the user 16A for the name of the user 16B (step 102). If the collaboration agent 10A cannot locate the name of the user 16B in the contacts list (step 104), the collaboration agent 10A may refer to an enterprise-wide directory, such as a Lightweight Directory Access Protocol (LDAP) directory (step 106). Assume that the collaboration agent 10A finds multiple names that are similar or identical to the name of the user 16B (step 108). The collaboration agent 10A can provide, via the first communications session 36A, a message to the user 16A requesting clarification, such as, for example, a last name associated with the user 16B, to remove the ambiguity (step 110). The user 16A can indicate the complete name of the user 16B, removing the ambiguity. The collaboration agent 10A can pull, or otherwise extract, a collaboration agent profile associated with the collaboration agent 10B from a database configured to store public profiles of users (step 112). The collaboration agent 10A can obtain an address of the collaboration agent 10B, for example, by using the identity of the user 16B in a directory lookup as is known by those skilled in the art. The collaboration agent 10A issues a QueryAvailability request to the collaboration agent 10B (step 114).
- The collaboration agent 10A receives a response from the collaboration agent 10B that includes one or more of presence information 30B, location information 32B, or calendar information 34B (step 116). If, from the provided information, the collaboration agent 10A determines that the user 16B is available for communication, the collaboration agent 10A can issue an AuthorizeCommunication request to the collaboration agent 10B (step 118). The collaboration agent 10B may review one or more rules associated with the user 16A and, for example, contact the user 16B to determine whether the user 16B desires to communicate with the user 16A. Assume that the collaboration agent 10B has determined that the user 16B desires to speak to the user 16A. Therefore, the collaboration agent 10B sends a response indicating an approval to talk with the user 16A and indicating that a voice channel is a desired communications mechanism (step 120). The collaboration agents 10A, 10B establish a communications session 36C between the conference bridge 20A and the conference bridge 20B, enabling the user 16A to communicate with the user 16B via the conference bridges 20A, 20B and the associated communications sessions.
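- For illustration, the FIG. 4 exchange can be reduced to two request types handled by the far-end agent. The sketch below keeps only the control flow; the message dictionaries are assumptions, since the disclosure names the QueryAvailability and AuthorizeCommunication requests but does not define their format.

```python
# Hedged sketch of the FIG. 4 exchange: QueryAvailability returns contextual
# information, AuthorizeCommunication returns an approval with the desired medium.
def handle_request(message, contextual_info, user_accepts):
    """Behaviour of the called party's agent (collaboration agent 10B); illustrative only."""
    if message["type"] == "QueryAvailability":
        return {"type": "AvailabilityResponse", **contextual_info}
    if message["type"] == "AuthorizeCommunication":
        return {"type": "Approved" if user_accepts else "Declined", "medium": "voice"}
    return {"type": "Unsupported"}

ctx_16b = {"presence": "available", "location": "office"}
availability = handle_request({"type": "QueryAvailability"}, ctx_16b, user_accepts=True)
if availability.get("presence") == "available":
    approval = handle_request({"type": "AuthorizeCommunication"}, ctx_16b, user_accepts=True)
    print(approval)   # -> {'type': 'Approved', 'medium': 'voice'}; agents then bridge session 36C
```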
- FIG. 5 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to another embodiment of the present invention. In this embodiment, assume the user 16A desires to communicate with the user 16B, but that the user 16B, unbeknownst to the user 16A, is traveling. Steps 200 through 208 may be identical to steps 100 through 116 discussed with reference to FIG. 4, and will not be repeated herein. However, in this embodiment the response from the collaboration agent 10B indicates that the user 16B is out of the office and traveling, but that a DoNotDisturb flag is reset, indicating that the user 16B may be contacted, if possible (step 210). The collaboration agent 10A issues an AuthorizeCommunication request to the collaboration agent 10B (step 212). The collaboration agent 10B, based on the contextual information 18B, determines that the user 16B is out of the office and traveling, and attempts to contact the user 16B via one or more user devices 22B associated with the user 16B (step 214). Notably, the collaboration agent 10B attempts to contact the user 16B without providing personal information, such as cell phone numbers associated with the user 16B, to the collaboration agent 10A.
- Assume that the collaboration agent 10B is able to establish a communications session 36B between the conference bridge 20B and a user device 22B associated with the user 16B. The collaboration agent 10B informs the collaboration agent 10A to establish a communications session 36C between the conference bridge 20A and the conference bridge 20B, enabling the user 16A to converse with the user 16B (step 216). Alternately, if the user 16B could not be reached by the collaboration agent 10B, the collaboration agent 10B can inform the collaboration agent 10A that the user 16B is currently unavailable (step 218). At this point, the collaboration agents 10A, 10B can offer the users 16A, 16B additional options. For example, the collaboration agent 10A can request the collaboration agent 10B to notify the collaboration agent 10A when the contextual information 18B associated with the user 16B indicates that the user 16B is present and available. Upon notification, the collaboration agent 10A can inform the user 16A that the user 16B appears to be present and available, enabling the user 16A to again attempt to contact the user 16B. The LeaveVoiceMessage request can be used to enable the user 16A to leave a voice mail message for the user 16B.
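- The follow-up options described above can be sketched as two operations offered by the far-end agent when its user is unreachable: register for an availability notification, or accept a voice message. Apart from LeaveVoiceMessage, the class and method names below are assumptions made for the sketch.

```python
# Illustrative stub of the far-end agent's follow-up options when the user 16B
# cannot be reached: a presence-change callback and a voice-message drop box.
class RemoteAgentStub:
    def __init__(self):
        self.watchers = []          # callbacks to fire when the user becomes available
        self.voice_mail = []        # stored voice message references

    def notify_when_available(self, callback):
        self.watchers.append(callback)

    def leave_voice_message(self, audio_ref):
        self.voice_mail.append(audio_ref)

    def presence_changed(self, available: bool):
        if available:
            for cb in self.watchers:
                cb("user16B is present and available")

agent_b = RemoteAgentStub()
agent_b.notify_when_available(lambda note: print("agent 10A informs user 16A:", note))
agent_b.leave_voice_message("voicemail-from-user16A.wav")
agent_b.presence_changed(available=True)
```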
- FIG. 6 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the present invention. The collaboration agent 10A determines that the user 16A has an imminent meeting based on contextual information 18A obtained from the calendar information 34A (step 300). The collaboration agent 10A determines that the user 16A is in the midst of a communication with another user 16. The collaboration agent 10A whispers a meeting reminder to the user 16A, via the first communications session 36A and the user device 22A, that the user 16A has an imminent meeting (step 302). By “whisper,” it is meant that the user 16A can hear the reminder from the collaboration agent 10A but no other participants in the call with the user 16A can hear the reminder.
- The collaboration agent 10A then receives a request from the collaboration agent 10B to establish a communications session between the conference bridge 20A and the conference bridge 20B (step 304). The collaboration agent 10A obtains meeting information from the calendar information 34A (step 306). The collaboration agent 10A determines that the user 16B is an attendee of the meeting that is imminent (step 308). The collaboration agent 10A establishes a communications session between the conference bridge 20A and the conference bridge 20B, enabling the user 16B to participate in the meeting at the time of the meeting (step 310). The collaboration agent 10A also whispers to the user 16A via the first communications session 36A that the user 16B has joined the conference bridge 20A (step 312).
- FIG. 7 is a flowchart illustrating a collaboration agent facilitating a real-time communication on behalf of a user according to yet another embodiment of the present invention. The collaboration agent 10A receives a request from the collaboration agent 10B to establish a communications session between the conference bridge 20A and the conference bridge 20B to enable the user 16B to converse with the user 16A (step 400). The collaboration agent 10A determines, based on contextual information 18A, that the user 16A is available for communication with the user 16B (step 402). However, the collaboration agent 10A determines that a rule exists that requires actual confirmation from the user 16A prior to establishing a communications session to enable conversations with the user 16B. Rules can be maintained in a storage device or memory accessible to the respective collaboration agent 10. Rules can be set up by a respective user 16 based on a variety of criteria such as calling party, time of day, and the like. The collaboration agent 10A identifies the user 16B to the user 16A via the communications session 36A (step 404).
- The user 16A initiates a command to the collaboration agent 10A requesting additional information, such as the nature of the call, from the user 16B (step 406). The collaboration agent 10A establishes a communications session 36C between the conference bridge 20A and the conference bridge 20B (step 408) and provides the question from the user 16A to the user 16B via the communications session 36C (step 410). The user 16B responds to the request, and the collaboration agent 10A provides the audio signals from the user 16B to the first communications session 36A so that the user 16A can hear the response from the user 16B (step 412). The user 16A indicates approval to the collaboration agent 10A for a conversation with the user 16B (step 414). The collaboration agent 10A enables communications between the communications session 36A and the communications session 36C to enable the user 16A to converse with the user 16B (step 416).
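- The screening flow of FIG. 7 can be summarized as a small question-and-approve loop around the two sessions. The sketch below stubs the user interactions as callables; the prompts and return values are assumptions.

```python
# Illustrative call-screening loop: the called user learns who is calling, asks for
# the nature of the call, hears the answer, and then approves or declines joining
# sessions 36A and 36C. All names and prompts are assumptions for the sketch.
def screen_call(caller: str, ask_reason, approve) -> bool:
    reason = ask_reason(f"{caller} is calling. Ask for the nature of the call?")
    return approve(f"{caller} says: {reason}. Accept the call?")

accepted = screen_call(
    "user16B",
    ask_reason=lambda prompt: "Quick question about the budget",  # relayed over session 36C
    approve=lambda prompt: True,                                  # user 16A confirms
)
print("join sessions 36A and 36C" if accepted else "politely decline")
```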
- According to another embodiment of the present invention, the collaboration agent 10A enables the user 16A to monitor a plurality of conference calls simultaneously. For purposes of illustration, assume that the user 16A is attending a first conference call via the conference bridge 20A. Assume that the first conference call is scheduled from 9:00 am to 11:00 am. Assume that the user 16A also has a second conference call that he is expected to attend that is scheduled from 10:00 am to 11:00 am. At 10:00 am, the collaboration agent 10A initiates a communications session with the conference bridge 20 that is hosting the second conference call. The collaboration agent 10A informs the user 16A that he has been joined to the second conference call, and mutes the outgoing audio associated with the first conference call so that attendees of the first conference call cannot hear the user 16A, and enables bi-directional audio on the second conference call so that the user 16A can indicate his presence to the attendees of the second conference call. The user 16A can then issue a command to the collaboration agent 10A to monitor the second conference call, and the collaboration agent 10A can enable bi-directional communications again for the first conference call, and inhibit outgoing audio for the second conference call so that attendees of the second conference call cannot hear the user 16A or the first conference call. The user 16A can now participate fully in the first conference call, while simultaneously monitoring the second conference call. Assume further that the user 16A determines that he must respond to a remark made on the second conference call. The user 16A issues a command to the collaboration agent 10A, and in response, the collaboration agent 10A inhibits the outgoing audio associated with the first conference call, and enables bi-directional audio with the second conference call. The user 16A speaks to the attendees of the second conference call, and the attendees of the first conference call cannot hear the user 16A, although the user 16A can continue to hear the audio associated with the first conference call simultaneously while the user 16A fully participates in the second conference call.
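- The dual-conference behavior above reduces to toggling a per-conference outgoing-audio flag while incoming audio from both calls remains available to the user 16A. The following minimal sketch assumes exactly two calls; the class and method names are illustrative only.

```python
# Minimal model of dual-conference monitoring: the agent routes the user's outgoing
# audio to exactly one conference at a time, while incoming audio from both is kept.
class DualCallMonitor:
    def __init__(self):
        self.outgoing = {"first_call": True, "second_call": False}   # mic on first call

    def speak_on(self, call: str):
        for name in self.outgoing:
            self.outgoing[name] = (name == call)   # mic to the chosen call, muted elsewhere
        return dict(self.outgoing)

monitor = DualCallMonitor()
print(monitor.speak_on("second_call"))   # respond to a remark on the second call
print(monitor.speak_on("first_call"))    # return to full participation in the first
```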
- If desired, the collaboration agent 10A can issue a prerecorded message to either conference call as the user 16A moves from active to passive participation. For example, when the user 16A indicates he desires to speak to the attendees of the second conference call, the collaboration agent 10A may play a message to the attendees of the first conference call that indicates that the user 16A is departing the conference call temporarily. When the user 16A returns to full participation in the first conference call, the collaboration agent 10A can issue another prerecorded message to the attendees of the first conference call that the user 16A has returned to the conference call.
- Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present invention. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/277,781 US20120036194A1 (en) | 2008-12-29 | 2011-10-20 | Collaboration agent |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/344,914 US8060563B2 (en) | 2008-12-29 | 2008-12-29 | Collaboration agent |
US13/277,781 US20120036194A1 (en) | 2008-12-29 | 2011-10-20 | Collaboration agent |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/344,914 Continuation US8060563B2 (en) | 2008-12-29 | 2008-12-29 | Collaboration agent |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120036194A1 true US20120036194A1 (en) | 2012-02-09 |
Family
ID=42286207
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/344,914 Expired - Fee Related US8060563B2 (en) | 2008-12-29 | 2008-12-29 | Collaboration agent |
US13/277,781 Abandoned US20120036194A1 (en) | 2008-12-29 | 2011-10-20 | Collaboration agent |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/344,914 Expired - Fee Related US8060563B2 (en) | 2008-12-29 | 2008-12-29 | Collaboration agent |
Country Status (8)
Country | Link |
---|---|
US (2) | US8060563B2 (en) |
EP (1) | EP2382745A4 (en) |
JP (2) | JP2012514367A (en) |
KR (1) | KR20110100244A (en) |
BR (1) | BRPI0923823A2 (en) |
CA (1) | CA2745472A1 (en) |
RU (1) | RU2011131697A (en) |
WO (1) | WO2010076629A1 (en) |
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8060563B2 (en) * | 2008-12-29 | 2011-11-15 | Nortel Networks Limited | Collaboration agent |
US8311203B2 (en) * | 2009-03-24 | 2012-11-13 | T-Mobile Usa, Inc. | User-initiated return communication |
US20100246791A1 (en) * | 2009-03-24 | 2010-09-30 | T-Mobile Usa, Inc. | Calendar-based return communication |
US8340631B2 (en) * | 2009-03-24 | 2012-12-25 | T-Mobile Usa, Inc. | Deferred communication and relationship management |
US9325599B2 (en) | 2009-03-30 | 2016-04-26 | Shoretel, Inc. | Methods for providing a status to devices in a distributed system |
US8493892B1 (en) * | 2009-03-30 | 2013-07-23 | Shoretel, Inc. | Resolving conflicts in distributed systems |
US8700665B2 (en) * | 2009-04-27 | 2014-04-15 | Avaya Inc. | Intelligent conference call information agents |
US20110077947A1 (en) * | 2009-09-30 | 2011-03-31 | Avaya, Inc. | Conference bridge software agents |
US8774787B2 (en) * | 2009-12-01 | 2014-07-08 | At&T Intellectual Property I, L.P. | Methods and systems for providing location-sensitive conference calling |
US8621005B2 (en) * | 2010-04-28 | 2013-12-31 | Ttb Technologies, Llc | Computer-based methods and systems for arranging meetings between users and methods and systems for verifying background information of users |
US8705410B2 (en) * | 2010-09-30 | 2014-04-22 | Avaya Inc. | Global conference roster for distributed bridges |
US20120130766A1 (en) * | 2010-11-24 | 2012-05-24 | International Business Machines Corporation | Device-independent attendance prompting tool for electronically-scheduled events |
US20130090973A1 (en) * | 2011-10-07 | 2013-04-11 | Shiu Hui | Enterprise Interaction Management Methods and Apparatus |
US20130145293A1 (en) * | 2011-12-01 | 2013-06-06 | Avaya Inc. | Methods, apparatuses, and computer-readable media for providing availability metaphor(s) representing communications availability in an interactive map |
US9178918B2 (en) | 2011-12-21 | 2015-11-03 | Level 3 Communications, Llc | Method for routing in a central conferencing routing server |
US9716860B2 (en) * | 2011-12-21 | 2017-07-25 | Level 3 Communications, Llc | Collaboration conference linking in a telecommunications network |
US10122771B2 (en) * | 2011-12-21 | 2018-11-06 | Level 3 Communications, Llc | Routing of conference participant based on caller recognition |
US20130179455A1 (en) * | 2012-01-09 | 2013-07-11 | International Business Machines Corporation | Collaboration data organizer |
US9699256B2 (en) * | 2012-09-28 | 2017-07-04 | Avaya Inc. | System and method for dynamic suggestion of optimal course of action |
US9432517B2 (en) | 2013-02-07 | 2016-08-30 | Avaya Inc. | Methods, apparatuses, and systems for generating an action item in response to a detected audio trigger during a conversation |
KR101437329B1 (en) * | 2013-04-03 | 2014-09-11 | 에스케이텔레콤 주식회사 | Method for collaborative context-awareness and apparatus for the same |
US20150092615A1 (en) * | 2013-10-02 | 2015-04-02 | David Paul Frankel | Teleconference system with overlay aufio method associate thereto |
US9591140B1 (en) * | 2014-03-27 | 2017-03-07 | Amazon Technologies, Inc. | Automatic conference call connection |
WO2016133319A1 (en) * | 2015-02-16 | 2016-08-25 | Samsung Electronics Co., Ltd. | Method and device for providing information |
US10235129B1 (en) | 2015-06-29 | 2019-03-19 | Amazon Technologies, Inc. | Joining users to communications via voice commands |
US11120342B2 (en) | 2015-11-10 | 2021-09-14 | Ricoh Company, Ltd. | Electronic meeting intelligence |
US20180097858A1 (en) * | 2016-10-04 | 2018-04-05 | International Business Machines Corporation | Embedded side call sub-channel used in a telecommunication session |
US10860985B2 (en) | 2016-10-11 | 2020-12-08 | Ricoh Company, Ltd. | Post-meeting processing using artificial intelligence |
US11307735B2 (en) | 2016-10-11 | 2022-04-19 | Ricoh Company, Ltd. | Creating agendas for electronic meetings using artificial intelligence |
US10510051B2 (en) | 2016-10-11 | 2019-12-17 | Ricoh Company, Ltd. | Real-time (intra-meeting) processing using artificial intelligence |
US10572858B2 (en) | 2016-10-11 | 2020-02-25 | Ricoh Company, Ltd. | Managing electronic meetings using artificial intelligence and meeting rules templates |
US10796697B2 (en) * | 2017-01-31 | 2020-10-06 | Microsoft Technology Licensing, Llc | Associating meetings with projects using characteristic keywords |
US11062271B2 (en) | 2017-10-09 | 2021-07-13 | Ricoh Company, Ltd. | Interactive whiteboard appliances with learning capabilities |
US10956875B2 (en) | 2017-10-09 | 2021-03-23 | Ricoh Company, Ltd. | Attendance tracking, presentation files, meeting services and agenda extraction for interactive whiteboard appliances |
US10553208B2 (en) | 2017-10-09 | 2020-02-04 | Ricoh Company, Ltd. | Speech-to-text conversion for interactive whiteboard appliances using multiple services |
US10552546B2 (en) | 2017-10-09 | 2020-02-04 | Ricoh Company, Ltd. | Speech-to-text conversion for interactive whiteboard appliances in multi-language electronic meetings |
US11030585B2 (en) | 2017-10-09 | 2021-06-08 | Ricoh Company, Ltd. | Person detection, person identification and meeting start for interactive whiteboard appliances |
US10757148B2 (en) | 2018-03-02 | 2020-08-25 | Ricoh Company, Ltd. | Conducting electronic meetings over computer networks using interactive whiteboard appliances and mobile devices |
US11720741B2 (en) | 2019-03-15 | 2023-08-08 | Ricoh Company, Ltd. | Artificial intelligence assisted review of electronic documents |
US11080466B2 (en) | 2019-03-15 | 2021-08-03 | Ricoh Company, Ltd. | Updating existing content suggestion to include suggestions from recorded media using artificial intelligence |
US11392754B2 (en) | 2019-03-15 | 2022-07-19 | Ricoh Company, Ltd. | Artificial intelligence assisted review of physical documents |
US11270060B2 (en) | 2019-03-15 | 2022-03-08 | Ricoh Company, Ltd. | Generating suggested document edits from recorded media using artificial intelligence |
US11573993B2 (en) | 2019-03-15 | 2023-02-07 | Ricoh Company, Ltd. | Generating a meeting review document that includes links to the one or more documents reviewed |
US11263384B2 (en) | 2019-03-15 | 2022-03-01 | Ricoh Company, Ltd. | Generating document edit requests for electronic documents managed by a third-party document management service using artificial intelligence |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5652789A (en) * | 1994-09-30 | 1997-07-29 | Wildfire Communications, Inc. | Network based knowledgeable assistant |
US5872841A (en) * | 1996-11-14 | 1999-02-16 | Siemens Information And Comunication Newtworks, Inc. | Apparatus and method for scheduling a telephone call |
US6870916B2 (en) * | 2001-09-14 | 2005-03-22 | Lucent Technologies Inc. | Targeted and intelligent multimedia conference establishment services |
JP2003204395A (en) * | 2001-10-30 | 2003-07-18 | Fuji Xerox Co Ltd | Method, system, and program for future communication negotiation and for postponement of communication |
JP3920175B2 (en) * | 2002-08-29 | 2007-05-30 | 株式会社国際電気通信基礎技術研究所 | Call activation system |
GB0513167D0 (en) | 2005-06-28 | 2005-08-03 | Mobestar Ltd | Communications system for anonymous communications |
WO2008009090A1 (en) | 2006-07-21 | 2008-01-24 | Bce Inc | Method, system and apparatus for handling establishment of a communication session |
US20080259824A1 (en) | 2007-04-23 | 2008-10-23 | Frankel David P | Identity-based conferencing systems and methods |
2008
- 2008-12-29 US US12/344,914 patent/US8060563B2/en not_active Expired - Fee Related
2009
- 2009-12-22 KR KR1020117015076A patent/KR20110100244A/en not_active Withdrawn
- 2009-12-22 EP EP09836145.4A patent/EP2382745A4/en not_active Withdrawn
- 2009-12-22 CA CA2745472A patent/CA2745472A1/en not_active Abandoned
- 2009-12-22 RU RU2011131697/08A patent/RU2011131697A/en unknown
- 2009-12-22 BR BRPI0923823-9A patent/BRPI0923823A2/en not_active IP Right Cessation
- 2009-12-22 JP JP2011542913A patent/JP2012514367A/en active Pending
- 2009-12-22 WO PCT/IB2009/007862 patent/WO2010076629A1/en active Application Filing
2011
- 2011-10-20 US US13/277,781 patent/US20120036194A1/en not_active Abandoned
2014
- 2014-09-24 JP JP2014194508A patent/JP2015043583A/en active Pending
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060075091A1 (en) * | 2004-09-30 | 2006-04-06 | Siemens Information And Communication Networks, Inc. | System and method for historical presence map |
US20060123082A1 (en) * | 2004-12-03 | 2006-06-08 | Digate Charles J | System and method of initiating an on-line meeting or teleconference via a web page link or a third party application |
US20070165554A1 (en) * | 2004-12-23 | 2007-07-19 | Agovo Communications Inc. | System, Method and Portable Communication Device |
US20100020728A1 (en) * | 2004-12-23 | 2010-01-28 | Todd Jefferson | System, Method and Portable Communication Device |
US20060288099A1 (en) * | 2005-05-06 | 2006-12-21 | Iotum Corporation, A Delaware Corporation | Method of and System for Presence Management in Telecommunications |
US20070081644A1 (en) * | 2005-09-23 | 2007-04-12 | Jack Jachner | Telephony/conference activity presence state |
US20070116225A1 (en) * | 2005-10-27 | 2007-05-24 | Wei Zhao | Systems and methods for efficient hybrid conferencing |
US20070165641A1 (en) * | 2006-01-18 | 2007-07-19 | Nortel Networks Limited | System and method for dynamically re-configuring communications session routing based on location information |
US20070167170A1 (en) * | 2006-01-18 | 2007-07-19 | Nortel Networks Limited | Method and device for determining location-enhanced presence information for entities subscribed to a communications system |
US20070165640A1 (en) * | 2006-01-18 | 2007-07-19 | Nortel Networks Limited | System and method for dynamically re-directing communications sessions based on location-enhanced information |
US20070253424A1 (en) * | 2006-05-01 | 2007-11-01 | Herot Christopher F | Web-based system and method of establishing an on-line meeting or teleconference |
US20090019367A1 (en) * | 2006-05-12 | 2009-01-15 | Convenos, Llc | Apparatus, system, method, and computer program product for collaboration via one or more networks |
US20080037746A1 (en) * | 2006-06-29 | 2008-02-14 | Nortel Networks Limited | Method and system for automatic call redialing |
US20080005235A1 (en) * | 2006-06-30 | 2008-01-03 | Microsoft Corporation | Collaborative integrated development environment using presence information |
US20080037446A1 (en) * | 2006-08-08 | 2008-02-14 | Cisco Technology, Inc. | Facilitating connection to a conference call |
US20080072158A1 (en) * | 2006-09-15 | 2008-03-20 | Antonio Samele | User collaboration system |
US20100022225A1 (en) * | 2006-10-29 | 2010-01-28 | Neatcall Ltd. | Methods and systems for setting, scheduling, optimizing, and initiating personal communication and prioritizing communication channels and devices |
US20080151785A1 (en) * | 2006-12-22 | 2008-06-26 | Nortel Networks Limited | Personalized conference bridge |
US20080159490A1 (en) * | 2007-01-03 | 2008-07-03 | Alcatel Lucent | System and method for controlling access to conference calls |
US20080159179A1 (en) * | 2007-01-03 | 2008-07-03 | Cisco Technology, Inc. | Scalable conference bridge |
US20080299954A1 (en) * | 2007-03-02 | 2008-12-04 | Aegis Mobility, Inc. | Management of mobile device communication sessions to reduce user distraction |
US20080253546A1 (en) * | 2007-04-13 | 2008-10-16 | Li Chen | Telephone Conference Call Management |
US20080300852A1 (en) * | 2007-05-30 | 2008-12-04 | David Johnson | Multi-Lingual Conference Call |
US20080310607A1 (en) * | 2007-06-17 | 2008-12-18 | Alcatel Lucent | Presence Based DTMF Signaling Enablement of Voice Communication Controller and Method |
US20090005038A1 (en) * | 2007-06-26 | 2009-01-01 | At&T Knowledge Ventures, Lp | Techniques for conference scheduling |
US20090054107A1 (en) * | 2007-08-20 | 2009-02-26 | Synaptics Incorporated | Handheld communication device and method for conference call initiation |
US20090210802A1 (en) * | 2008-02-19 | 2009-08-20 | Microsoft Corporation | Location information in presence |
US20100121666A1 (en) * | 2008-11-12 | 2010-05-13 | Oracle International Corporation | Management and automatic invocation of scheduled collaboration events |
US20100153497A1 (en) * | 2008-12-12 | 2010-06-17 | Nortel Networks Limited | Sharing expression information among conference participants |
US20100158220A1 (en) * | 2008-12-23 | 2010-06-24 | At&T Mobility Ii Llc | Calendar-callback voicemail |
US20100165889A1 (en) * | 2008-12-29 | 2010-07-01 | Pramod Madabhushi | Distributed audio conferencing architecture with optimum resource utilization and seamless scalability |
US20100169418A1 (en) * | 2008-12-29 | 2010-07-01 | Nortel Networks Limited | Collaboration agent |
US20100260074A1 (en) * | 2009-04-09 | 2010-10-14 | Nortel Networks Limited | Enhanced communication bridge |
US20100325214A1 (en) * | 2009-06-18 | 2010-12-23 | Microsoft Corporation | Predictive Collaboration |
US20110075826A1 (en) * | 2009-09-30 | 2011-03-31 | Avaya, Inc. | Assignment of full enterprise identity to audio conference bridges for improved conference scheduling and call-in experience |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100296417A1 (en) * | 2009-05-20 | 2010-11-25 | Avaya Inc. | Grid-based contact center |
US8964958B2 (en) * | 2009-05-20 | 2015-02-24 | Avaya Inc. | Grid-based contact center |
US20140093064A1 (en) * | 2011-05-24 | 2014-04-03 | Nec Corporation | Communication processing system, communication processing method, communication processing device, and control method and control program of communication processing device |
US20140222907A1 (en) * | 2013-02-01 | 2014-08-07 | Avaya Inc. | System and method for context-aware participant management |
US9756083B2 (en) * | 2013-02-01 | 2017-09-05 | Avaya Inc. | System and method for context-aware participant management |
US11551689B2 (en) | 2020-09-30 | 2023-01-10 | International Business Machines Corporation | Voice command execution |
Also Published As
Publication number | Publication date |
---|---|
EP2382745A4 (en) | 2014-09-10 |
BRPI0923823A2 (en) | 2015-07-14 |
US20100169418A1 (en) | 2010-07-01 |
JP2012514367A (en) | 2012-06-21 |
CA2745472A1 (en) | 2010-07-08 |
US8060563B2 (en) | 2011-11-15 |
RU2011131697A (en) | 2013-02-10 |
JP2015043583A (en) | 2015-03-05 |
WO2010076629A1 (en) | 2010-07-08 |
EP2382745A1 (en) | 2011-11-02 |
KR20110100244A (en) | 2011-09-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8060563B2 (en) | Collaboration agent | |
US10289982B2 (en) | Context aware interaction | |
US7277697B2 (en) | Method and system for establishing a teleconference over a telephony network | |
US7542756B2 (en) | Apparatus and method for restoring a conference connection to a cellular telephone | |
US9516167B2 (en) | Media channel management apparatus for network communications sessions | |
US11546467B2 (en) | Method for establishing a telecommunication connection | |
US8369505B2 (en) | Call access management | |
US20100128857A1 (en) | Call forwarding system and method employing virtual phone numbers associated with landline and other discrete telephone units | |
US20080292065A1 (en) | Single Point of Contact Personal Communication System | |
US8027447B2 (en) | Call processing based on electronic calendar information | |
US10686939B1 (en) | Conferencing and meeting implementations with advanced features | |
US20130102293A1 (en) | Method and system to automatically park a voice call for data transfer | |
US7333803B2 (en) | Network support for voice-to-text memo service | |
US20130058473A1 (en) | Digital Network-Based Telephone Systems and Functionality | |
US10277747B2 (en) | Systems and methods for accessing conference calls | |
KR20060088257A (en) | Utility system and operation method providing multi messaging service in communication system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ROCKSTAR CONSORTIUM US LP, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROCKSTAR BIDCO, LP;REEL/FRAME:032436/0804 Effective date: 20120509 |
|
AS | Assignment |
Owner name: RPX CLEARINGHOUSE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROCKSTAR CONSORTIUM US LP;ROCKSTAR CONSORTIUM LLC;BOCKSTAR TECHNOLOGIES LLC;AND OTHERS;REEL/FRAME:034924/0779 Effective date: 20150128 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, IL Free format text: SECURITY AGREEMENT;ASSIGNORS:RPX CORPORATION;RPX CLEARINGHOUSE LLC;REEL/FRAME:038041/0001 Effective date: 20160226 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: RPX CLEARINGHOUSE LLC, CALIFORNIA Free format text: RELEASE (REEL 038041 / FRAME 0001);ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:044970/0030 Effective date: 20171222 Owner name: RPX CORPORATION, CALIFORNIA Free format text: RELEASE (REEL 038041 / FRAME 0001);ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:044970/0030 Effective date: 20171222 |