Dear all
I am working on a prototype application for a customer, to deliver the collaborative aspects of the application design.
To better identify the proper foundation to use for it, the scenario is as follows:
- A project manager connects to a hosted web portal where he has set up his presentation.
- Several employees then need to follow that presentation and collaborate with it live.
- The PM shows, for instance, a project plan picture on a touch screen and, while explaining it, starts to ink and take notes on the plan for better understanding.
- Each employee has a Windows tablet in hand and can see directly on the tablet what the PM is showing on the touch screen (including added annotations, updated live).
- Employees can also place annotations directly from their tablet, and these are reflected on the PM's touch screen (a rough sketch of the kind of relay I imagine follows this list).
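To make the data flow concrete, is something like the sketch below the right direction? It is only my guess at how it could work: every device opens a WebSocket to a small relay behind the hosted portal, and whatever one device sends (an ink stroke, a note) is broadcast to the other devices in the same session. The Node.js ws package, the port, and the "session" query parameter are all placeholders I made up, not anything from an existing product.

```typescript
// Rough relay sketch (my assumption): each device connects over a WebSocket
// and every message is re-broadcast to the other devices in the same session.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

// sessionId -> sockets of all devices in that presentation (PM screen + tablets)
const sessions = new Map<string, Set<WebSocket>>();

wss.on("connection", (socket, request) => {
  // The "session" query parameter is a placeholder for identifying the presentation
  const url = new URL(request.url ?? "/", "http://localhost");
  const sessionId = url.searchParams.get("session") ?? "default";

  const peers = sessions.get(sessionId) ?? new Set<WebSocket>();
  peers.add(socket);
  sessions.set(sessionId, peers);

  socket.on("message", (data) => {
    // Relay the raw message (e.g. a serialized ink stroke) to every other device
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(data.toString());
      }
    }
  });

  socket.on("close", () => peers.delete(socket));
});
```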
What I am trying to achieve:
What I want to work out is how to get what is happening on the touch screen onto the tablets live, and the reverse: how notes taken on a tablet can be sent out so they become visible on the touch screen (this could be inking, text, ...).
I would also like to know what technique should be used so that ink points, object positions, and touch input get sent to the other devices.
Do I have to record all touch input, serialize it, and rebuild the touch sequence on the other side?
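To clarify what I mean by serializing and rebuilding, here is a rough sketch of the direction I am imagining, in browser TypeScript with an HTML canvas (on the Windows tablet side this would map to whatever ink/stroke API the platform offers). The StrokeMessage shape, the device id, and the WebSocket URL are just assumptions I made up for illustration:

```typescript
// Sketch (my assumption): capture pointer input as a list of points, send the
// whole stroke as JSON over the relay, and redraw it on the receiving device.
interface StrokeMessage {
  type: "stroke";
  deviceId: string;                    // who drew it (PM screen or a tablet)
  points: { x: number; y: number }[];  // normalized 0..1 so screen sizes can differ
}

const canvas = document.querySelector("canvas")!;
const ctx = canvas.getContext("2d")!;
const socket = new WebSocket("wss://example.com/ink?session=demo"); // placeholder URL

let current: { x: number; y: number }[] = [];

canvas.addEventListener("pointerdown", () => { current = []; });

canvas.addEventListener("pointermove", (e) => {
  if (e.buttons === 0) return; // only while pressing / touching
  current.push({ x: e.offsetX / canvas.width, y: e.offsetY / canvas.height });
});

canvas.addEventListener("pointerup", () => {
  const msg: StrokeMessage = { type: "stroke", deviceId: "tablet-1", points: current };
  socket.send(JSON.stringify(msg)); // serialize the completed stroke
});

// Rebuild and draw the stroke when it arrives from another device
socket.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data) as StrokeMessage;
  if (msg.type !== "stroke" || msg.points.length === 0) return;
  ctx.beginPath();
  ctx.moveTo(msg.points[0].x * canvas.width, msg.points[0].y * canvas.height);
  for (const p of msg.points.slice(1)) {
    ctx.lineTo(p.x * canvas.width, p.y * canvas.height);
  }
  ctx.stroke();
});
```

My thinking is that sending whole strokes (or small batches of points) rather than every raw touch event would keep the traffic small and the replay fast, but I do not know if that is the recommended approach.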
I am not sure how to approach this and am looking for the proper way, knowing that it needs to be fast.
Thanks for your comments
Regards