Hi, I haven’t started this tutorial yet. But maybe you have some clothes I could use in the future for this tutorial, so I won’t need to search for them myself?
If you ask generally, then of course it is possible. If you’re asking whether I will do it – I will, but I can’t tell you whether it will be available as a tutorial or a DEMO. We’ll see in the future. 🙂
Hello sir, I am a student and I want to develop my own project: interior design using markerless augmented reality. How can I do that? Can you give me a demo on that topic which I can use as a reference for my project? Please give me a demo on interior design using augmented reality.
Actually I am very confused, so it would be really great if you helped me out. Please let me know how it works through your demo for interior design augmented reality. I have prepared the marker-based interior design by taking your demos as reference, but now I want to do it markerless, so please give me one demo on it.
This is where I suggest you start: https://www.youtube.com/watch?v=qfxqfdtxyVA
This is markerless AR. Just start by adding your interior design content. There’s no need for a separate tutorial on this.
sketch_aug14b:32: error: ‘ADXL345’ does not name a type
sketch_aug14b.ino: In function ‘void setup()’:
sketch_aug14b:40: error: ‘adxl’ was not declared in this scope
sketch_aug14b.ino: In function ‘void loop()’:
sketch_aug14b:46: error: ‘adxl’ was not declared in this scope
Please, can anyone say how to solve this error?
I just downloaded the library and pasted it… I’m still getting this error… what do I have to do?
Reply as soon as possible.
Are you sure you copied the library into the right directory? Could you paste the path to this library?
After you copied the library, did you restart the Arduino IDE itself?
Suppose I want to take a runtime image directly into my application and the user can place it wherever they want – how can I do that? Which platform will be suitable for my project? Suppose I want to take an image from an online shopping site as input to my application, and as output the user will see how that interior would look.
Just to make it clear, you’re talking about the app using META glasses, right? By saying “runtime image directly” do you mean taking a snapshot from the camera with augmented content, and placing this picture anywhere you want in augmented reality?
Hello, very good job. I have this:
Error DllNotFoundException : MetaVisionDLL
Meta.CanvasTracker.Release ( )
Meta.CanvasTracker.OnDestroy ( )
Would you know how to fix it? Thank you.
Hello, I was wondering if I could use this library on a 2-axis accelerometer. I will download the library now and see if you utilize function overloading so that I can pass in only values for the x and y axis; but if not, do you have any ideas? I have a 2-axis accelerometer that is hooked up for I2C ONLY. Please let me know if you have any suggestions or advice. Thank you. -Joe
ADXL345 is a 3-axis accelerometer and using the library provided, you should not have problems in acquiring that information. I2C works perfectly for that.
Instead of creating confusion with words, this is what I want: to create an Android application for interior design with augmented reality… https://www.youtube.com/watch?v=ipkz6y9mfvk
… I want to create an Android application in which the user can buy furniture from online shopping sites and, by using my app, will be able to see an augmented view of that furniture at their place; if they like it, they can purchase it. I hope this time it’s clear what I want to say.
Hmm, yes, I know it’s not so simple and not so easy, but I want to do it – not for a particular output but for the sake of knowledge. Please can you guide me on this? I will do my best; just guide me and I will work hard on it.
Okay, thanks… but I have to submit it as my college project, so let me know where I should start. Which platform will be suitable for my project and what should I do first? Just let me know that and I will be able to start my work.
Can you give me a way to do this: after tracking the money paper and getting a lot of papers, when I zoom in on the virtual papers I want to replace them with another image
…
Thank you
Hi, greetings from Rio de Janeiro, Brazil! First, I wanna thank you for all your tutorials; I am trying to learn more about Unity3D since I started to watch your videos. But I have this question: if I want to use my smartphone as stereo glasses with augmented reality, does Vuforia generate an output app for this? Thanks once again, and I hope your ideas help to transform our world into a better place!
Hello, Ricardo. Thank you for the kind words.
Actually I don’t know the answer; I haven’t done anything like that so far. I mean, I didn’t try to use a smartphone as AR glasses. But if you find some useful info on the internet later on while researching, please let me know – I’m interested in everything related to AR.
Yeah, it’s not in the gallery, but it’s somewhere in your device’s memory. If you know how to modify this code in order to send the pictures to the gallery, please, let us all know! So far I achieved this (saving pictures to gallery) only by using Unity3D assets/plugins, which comes with a price.
Hi. I have downloaded Processing 3. When I run simpleLite the following error is shown in the console. “No library found for processing.video
Libraries must be installed in a folder named ‘libraries’ inside the ‘sketchbook’ folder.”
Any solution?
Hi,
I don’t have any camera on my computer, although I would like to use the camera of my mobile phone. Is it possible to make the program search for the IP address of a camera?
Cheers
I downloaded your project from this site.
Only one object is viewable at a time; I want to run it as in your video.
I run the app on an Android mobile – is any setting required to run the app in multi-object mode?
I would suggest starting with recognizing the Kuriboh card first, and after that the Blue-Eyes White Dragon. Actually, the Kuriboh card doesn’t have that many good tracking features. Before using the app on Android, did you test it in Unity3D play mode? I would suggest you do so and check how well multi-tracking is accomplished.
Thanks! Actually I don’t plan to make a tutorial on Google Glass as I don’t have it. I have META AR glasses, as I’ve seen more potential in them. Who knows what the future will bring 🙂
Hello, I’ve been trying to make a Cylinder Target based on your tutorial,
and while I’m trying to upload the SIDE image to the Vuforia Target Manager, it fails, saying the Euforia of Beauty Logo image doesn’t match the dimensions. Can you post a tutorial on how to measure the image so it can be uploaded to the Target Manager?
I’ve added a rescaled image (Download # Print Euforia of Beauty Logo to Augment the Content and Create the Tracker for Cylindrical Object (*.jpg file)); you can try to upload it once again.
Hello, I tried to make a rotation button on Android,
but it’s always looping.
The only way to stop it is to hold the button,
but if I release the button, it loops again.
Thanks a ton for all this knowledge sharing. I have become a serious follower of your tutorials and have also subscribed to your YouTube channel. These tutorials are great assets for people like me who are getting into the AR field.
Hi, I am totally new to AR and not a software developer (not a programmer at all). Thank you, I tested it out and it works. But I have a question: if I want to use my own 360 picture, how can I upload it and use it?
Hi, thank you for the tutorial, it is a really good kick start for someone who is totally new and wants to learn, like me. However, I have a question: after doing everything you did in the video, when pressing play, how do I link it to my Android? Are there any videos you made which explain this?
Hi,
Thank you for the tutorial video, it is a really good kick start for someone totally new like me.
It would be great if you could help me with my question.
After doing everything you did in the video on a PC and pressing the play button, how do I link the program to my Android device?
I have been trying to run this code for hours, but for some reason the Arduino IDE ver 1.6.5 on Windows 10 cannot find HMC5883L.h. I’ve placed copies of the library in the main library folder, the sketchbook library folder and the hardware library folder, but it continues not to find it. I also used the Library Manager. Help.
Hi, I have Windows 10 x64; I tested it right now and it works perfectly. Make sure that the library directory looks something like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_Example
not like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_library\HMC5883L_Example
Depending on how you extract the libraries, you might end up with two identical folders nested one inside the other (HMC5883L_library\HMC5883L_library\HMC5883L_Example), so I assume Arduino can’t recognize it.
Hi, I don’t know why, but the Arduino software doesn’t recognize the ADXL345 library. The program shows it in black instead of orange.
The folder is in this directory: C:\Users\Leyre\Documents\Arduino\libraries\ADXL345_library, together with the other libraries (which it recognizes fine), and the ADXL345_library folder is not duplicated.
Nevertheless, when compiling, the software doesn’t show any error.
Any idea? I need help, please.
First you said it does not recognize the library, but then you say that it “doesn’t show any error”. I don’t understand. How do you know that it can’t recognize the library if no errors are shown?
Thank you for answering.
The program shows the library and all the functions related to the accelerometer in black instead of orange, whereas the rest are in orange. Isn’t that weird?
Nevertheless, it compiles and I can upload the program to the Arduino UNO board (although the data acquired is really weird).
That’s really weird; it’s hard to suggest something right now, but answer this question: do you really use the GY-85 board? Yes or no?
This is important because I had some weird data readings while using a board with only the accelerometer alone, and I couldn’t find a solution for that.
Hi again,
I have been trying the code the entire day, but I don’t understand the outputs.
I want the angles in degrees. On the one hand, I get rolldeg and pitchdeg between 20 and -20 instead of 90 and -90 degrees.
On the other hand, when I print anglegx, anglegy and anglegz, I get signals which change even when the IMU is stationary.
I would like to add a complementary filter to your code, but with those output data I can’t.
I have watched the video and it seems to me that the output data shown are correct, but I don’t get the same.
Could you help me please? I’m a little desperate because my project depends on a good measurement of the angles.
Best regards, and sorry about the mess.
If you send me some pictures of the sensor wiring to the Arduino microcontroller and screenshots of the error in the program, maybe I will be able to help you.
Hi Edgaras, thanks for these great tutorials,
they have been a great help.
I’m just wondering why you don’t add audio to these tutorials, explaining what you are doing – it would provide much more help.
Thanks for your reply. I really need a tutorial on how to use Vuforia and the Cardboard SDK.
The app would scan the image target to track the AR world, and the user would be given a button that, when looked at, will teleport the user into the VR world and vice versa, just like the Vuforia sample.
I am having the same problem; after checking, the error is in this variable:
ADXL345 adxl; //variable adxl is an instance of the ADXL345 library
Arduino_AccelerometerADXL345_Servos:32: error: ‘ADXL345’ does not name a type
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void setup()’:
Arduino_AccelerometerADXL345_Servos:40: error: ‘adxl’ was not declared in this scope
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void loop()’:
Arduino_AccelerometerADXL345_Servos:46: error: ‘adxl’ was not declared in this scope
Hello! Student studying physical computing here. To run it by “dropping in”, are there any constraints on dimensions/scale or file size for the OBJ model?
If your *.OBJ model is huge, most likely the model itself is really complex and has lots of vertices/polygons. I don’t know the exact constraints on this matter; it also depends on your computer specifications. You should sort this out by experimenting with the models you have (if they are really complex).
Hello, when I move the sensor fast, the servo motor locks and does not follow the movement, and it makes a noise (tec tec tec); after resetting, it goes back to normal. Can you say what it could be?
I’m using a TowerPro MG946R servomotor powered directly with 5 V from the Arduino.
Hello! Good day! I am a beginner and I need your help, please. I downloaded your code; I have the ADXL345 3-axis digital accelerometer and 2 servo motors. I made the right connections you’ve indicated above. But, like other comments say, there are some errors… What should I do, sir? And where should I start? I don’t understand the library stuff you mentioned. What library is it? Where could I find it? And where do I copy it?
For starters, I didn’t make the app for this book; this is only a DEMO of what other developers did. I haven’t tried it yet, but one of the users provided a link (Vuforia standalone) in order to make Vuforia work on the PC platform. However, I haven’t tried it yet, and right now I can’t find the link – it’s somewhere in one of the tutorials’ comments.
Hi, I already made those examples, but I have a question: do you know about the DragonBoard 410c? I want to run my app on that board, but I installed Ubuntu Linaro and my app doesn’t work with it… Do you know, or have you exported and run some of your apps on some other board?
Hi,
I will buy the Meta Dev Kit 1 soon. I have a question about your work:
with which program do you make these videos?
In the videos the FOV is extremely big! Is this only in the video, or on the Meta too? I have a BT-200, and this device has a small FOV at near distance.
Do you have contact with META about when Dev Kit 2 appears?
I suggest comparing the specifications of the BT-200 and the META glasses. I didn’t have the opportunity to use other AR glasses (only META), so it’s hard to compare, but those who put the META glasses on for the first time say the FOV is not so big. So, of course, watching the video here and using them for real is like day and night.
Hi! Great tutorial, but I’m stuck on the building part… I am doing the same as you in the video, but my APK is not working on Android… it shows a black screen… I never developed for Android and I believe I am missing something in the setup. My project works great in the Unity preview, the whole build process is OK too [no errors], but after copying it to the device and installing, nothing happens. Can you point me in some direction? I need to run the project on a phone and everything seems to be OK except this.
Thanks a lot!
It’s hard to say, but if it’s not a “top secret” project, send it to me and I’ll take a look. We can start from the *.apk file. Maybe it’s the smartphone’s problem. Who knows. It’s also worth googling this problem.
Hi
Thank you so much… this is very useful for beginners like me.
I have one question.
I tried this with two different objects and tried it on an Android device. The issue is that both my objects are visible from the start… The buttons didn’t work…
Did you put the virtual buttons on a textured tracking object? If not, I suggest you do so. Don’t put virtual buttons on a plane without any textures. I hope that helps.
Hello,
thanks for your answer.
I think Meta will bring out a new version of their glasses soon. I think the name is Dev Kit 2.
Do you have any information about this release?
I can see that you’re not using Windows OS, so it’s hard for me to suggest something. Also, you’re using Processing 2.1.2, not 2.2.1, but I don’t think that causes the problem.
Hi, a thousand thanks for your code! I tried several that didn’t work, but this one works perfectly!
Btw, I will need to interface with two sensors. I don’t know how to change the code to measure two – could you please help me?
No, I can’t. These are raw data readings from the sensor.
Alright, thanks anyway.
Hi,
I’m an interactive developer using Unity3D; I sincerely need your help on how to integrate Vuforia and the Google Cardboard SDK in Unity3D for architectural visualizations.
Hello, John. I haven’t done anything on that yet. But I will soon; you can wait for a tutorial (2 to 3 months), but if you want it quicker, it won’t be for free.
But I received this when running the program: it couldn’t find the path of ICSharpCode.SharpZipLib.dll to unzip the file. I couldn’t understand it. I am not a programming guy…
Can you help me?
Is C# a language that supports Android?
Sorry, it is difficult for me to understand the script and do something with it.
C# is supported by Unity, and using Unity we can export an app for Android devices (and other OSes). I really haven’t stumbled upon this problem; can you copy-paste the directory of your project files? Have you tried anything simpler up until now? For instance: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ I suggest you start from there and then move on.
Can you please share the code?
Actually, I needed the code with which you drew the rectangle around the marker, and how to return the coordinates of the corners of the marker.
Well, actually there is a way, but it would be more or less a “workaround” that I would not suggest using. What do I mean? Basically, you would need to export your model animation for every frame, so most likely you would end up with lots of *.obj files, and you would need to load these models frame by frame in the void draw() part.
Option 2: search for a library that is able to import an animation file (*.fbx extension). I couldn’t find a better solution back in those days, but maybe one is available now. Who knows… some research is needed.
Option 3: Try doing AR tutorial No 14 that involves Unity3D and Vuforia, you can add animations there quite easily.
Thanks for your advice. Actually, I went through all your videos and the Unity ones are quite cool. But I’m a newbie with Unity and coding any logic into it is becoming difficult.
But I will try the FBX part you mentioned.
Hi, thanks for all the tutorials.
I made my AR APK with Unity + Vuforia; my mesh model comes from my own SketchUp models (imported as FBX into Unity). When I try it on PC it works fine, but when I installed it on my smartphone it works with slow response and lag. Any suggestions regarding the AR APK size? (Mine is 187 KB.)
Next question… how do I make a quit button in the app? Thanks.
Now I’m developing an AR/VR app – does this snapshot button just capture the on-screen interface?
Does it work on a stereoscopic interface? I want just a monoscopic (single) image saved.
Is it possible to give us any info or hints about the making of this demo, or can you please mention any reference we can use to learn scripting in Unity and achieve the same results?
Yes, I know, of course, that you are using a distance script… thanks… but the main part is how to modify the distance script parameters, or how to write a new script myself using PlayMaker or something else.
What is the best way to learn scripting in Unity?
Hello,
Is it possible to turn the buttons into virtual ones that can be pressed with your hand, like you demonstrated in another tutorial (No. 19)? I don’t succeed when I try to merge the UI code from this tutorial with the one for Virtual (Vuforia) buttons…
I thought it’s just because I’m a lame coder… Is there a way to extend interactions with virtual buttons (Vuforia + Unity)? For instance, jumping to the next scene or playing a video?
Btw, thanks for all your tutorials, they are very helpful.
Yes, there is a way. The same way I switch models in this tutorial (http://www.himix.lt/augmented-reality/augmented-reality-virtual-buttons/), you can add different functions – load another scene and so on. Just dive into the code of “VirtualButtonEventHandler.cs” (starting from case “btnLeft”: and case “btnRight”:).
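As a minimal sketch of the switch the reply above points to – this is not the tutorial’s actual VirtualButtonEventHandler.cs; it assumes an existing Vuforia button-pressed callback already hands over the pressed button’s name, and the button and scene names here are made-up examples:

using UnityEngine;
using UnityEngine.SceneManagement;

public class VirtualButtonActions : MonoBehaviour
{
    public GameObject leftModel; // model toggled by the "btnLeft" virtual button

    // Call this from your existing virtual-button-pressed callback.
    public void HandleButtonPressed(string buttonName)
    {
        switch (buttonName)
        {
            case "btnLeft":
                // same idea as the model switching in the tutorial
                leftModel.SetActive(!leftModel.activeSelf);
                break;
            case "btnNextScene":
                // extended interaction: jump to another scene
                SceneManager.LoadScene("SecondScene");
                break;
        }
    }
}

Note that SceneManager.LoadScene only finds scenes that have been added in File > Build Settings.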
I will try until I make it, thank you very much!
There’s one more issue no one could answer on Vuforia’s forum: is it possible to trigger a whole environment in which the viewer can dive?
Should I use extended tracking (the triggered image would be much bigger than the marker) and keep the image target active even when tracking is lost (can I just disable this function so it stays on even if I turn the device/camera in another direction)?
I’ve read about different plug-ins, like Unified Coordinate System, that could help with building augmented environments… Could you point me in a direction I should go?
Cheers!
It is possible, but I haven’t done this in Unity. Basically, what you need I did here with a MARG sensor (http://www.himix.lt/arduino/arduino-and-virtual-room-using-mpu-9150-marg/), just with pictures and without tracking any image target. The same applies to smartphones and tablets. I haven’t heard anything about the plugin you mentioned.
Hi, how can I connect it to the Ethernet shield? And what could I set up as an output device using the flame sensor?
I am serious; currently there’s probably an even newer version. This asset was downloaded not from the Unity Asset Store, but from Leap Motion’s website.
Hi, thank you so much for this awesome tutorial. By the way, can a particle system be controlled by our 3D object instead of using Arduino? For example, when I click a 3D object such as a factory, the particle system – smog, for example – will be emitted. I’m trying to make an interaction with my AR project. I really hope you can help me. Thank you.
Can I implement the app on a tablet connected to an external camera?
And how do you track each part of your body? Can Kinect distinguish each part and give it a tag?
I don’t know whether an external camera can be connected to the tablet; I haven’t tried to do so.
And yes, Kinect can distinguish different parts of your body – I mean track your body parts/joints, their position and orientation.
Hello, I’m from Peru. I have watched the Unity3D tutorials on the website and the scripts are not correct… Thanks, admin, for the help you give us in your tutorials, because the examples from the Unity website don’t work for me at all.
“Hello, I’m from Peru. I visited the Unity3D tutorials on the website and the scripts are not right… thanks, admin, for the help that you give us in your tutorials, because Unity’s examples on the website don’t help me.”
– Something like that! Sorry for my bad English too!
I’m amazed, I have to admit. Rarely do I encounter a blog
that’s both educative and entertaining, and without a doubt, you’ve hit the nail on the head.
The issue is something which too few men and women are speaking intelligently
about. I’m very happy that I came across this in my hunt for something regarding this.
Hi, I have followed your tutorial and everything works fine in play mode, but I can’t build the project for Android.
Worth mentioning: I configured everything like you show in the money tutorial, and that project works fine… Can you help with this, or is text recognition not working on Android?
Thanks in advance.
Bart
May I ask, what if there are two models in one scene? Your tutorial only has one model per scene. Then what happens to the tag? Can I tag both of my models as Model?
I have tried setting both characters to the same tag name, Model, but it does not work.
What happens is that only one character scales up and down when I click the button; nothing happens to the other one.
I’m creating a scene that has two characters: one person performing CPR and the other person is the patient. I need both characters to scale up and down at the same time when the button is clicked.
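A hedged sketch of one way to handle this – not the tutorial’s own script; it assumes both characters are tagged “Model” and that the UI Button’s OnClick event is wired to ScaleAll() with a scale factor:

using UnityEngine;

public class ScaleTaggedModels : MonoBehaviour
{
    public string modelTag = "Model";

    // Wire this to the button's OnClick with e.g. 1.2 (scale up) or 0.8 (scale down).
    public void ScaleAll(float factor)
    {
        // FindGameObjectsWithTag returns every active object carrying the tag,
        // so both characters are scaled in one click.
        foreach (GameObject model in GameObject.FindGameObjectsWithTag(modelTag))
        {
            model.transform.localScale *= factor;
        }
    }
}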
How to create or how to use already created models?
You can create models using 3Ds Max, Maya, Blender, SolidWorks and lots of other 3D modelling tools.
How to use them? You should put the model in the data folder and change some Processing code (you will figure it out if you watch closely).
How do I reset the animation once the target has been detected? It seems that the animation just pauses/continues playing when the target is not detected.
Hi, great tutorial, thank you, you helped me a lot in learning AR. Can you help me with one question: when I install it, it says the moneyar.apk is OK, but when I open it the screen goes black and nothing happens 🙁
Great tutorial, thanks, but I want to ask a question. I’m developing a Vuforia app,
but when I take a screenshot using your code, my result is only a white screen – the augmented view comes out with only a white background. Can you help me, please?
Sorry for my bad English, thanks.
Hi, I did it the same way you did and everything worked. It’s just that the UI buttons stay on screen even when the image is not tracked; the UI buttons just stay at the last place where it was tracked. I want the buttons to appear where they should when the image is tracked and to disappear when it is not tracked. Please help. Thanks in advance.
Hi, first, thank you for your work! I have the same problem: I did everything, but the UI buttons stay on the screen no matter what. Whether the Canvas is inside or outside the ImageTarget, the result is the same – the UI buttons are always on screen.
Please help. Thank you.
I have the same problem as Nqb. I have tried making the Canvas a child of the Image Target, but the Canvas is still rendered onto the screen even if the image is not tracked. Can someone please help me out? Thanks in advance.
Hey, could you suggest any tutorials for using real-world marker input instead of virtual buttons? The plan is to make an application which reads the position of a real-world marker and responds when it hovers over a real-world button printed on the paper page, instead of virtual buttons.
I have the same problem as Nqb, and I already put the Canvas with the buttons inside the Image Target, but the result is still the same: the buttons still pop up when the marker is lost.
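A minimal sketch of one common fix for the disappearing-UI question above – this is not from the tutorial; it assumes the Canvas is referenced from a small helper whose ShowUI()/HideUI() methods you call from the tracking-found and tracking-lost branches of the Vuforia trackable event handler sitting on the ImageTarget:

using UnityEngine;

public class TrackedCanvasToggle : MonoBehaviour
{
    public GameObject uiCanvas; // the Canvas holding the buttons

    // Call from the handler's tracking-found branch (OnTrackingFound in the sample handler).
    public void ShowUI() { uiCanvas.SetActive(true); }

    // Call from the handler's tracking-lost branch (OnTrackingLost in the sample handler).
    public void HideUI() { uiCanvas.SetActive(false); }
}

As far as I recall, the sample trackable handler only toggles Renderer and Collider components on tracking lost, which does not affect Canvas UI elements – hence the explicit switch here.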
Hello, thanks for the tutorial, it’s helpful 🙂
As far as I understand, the primary surface used to track the scene could be the size of a dining table. What if there are 5 image targets in 5 different places (not far apart) – would that extend the size?
Example: let’s say I put 4 image targets at the edges of a table. Would they all be tracked at the same time, with the “props” being the same for all of them, or would each target define its own scene and props?
This is something that you will have to test on your own, but I would say each target would have its own separate “props”, not the same ones. This is my logical guess.
Yes, man. On PC I can see the button and the panel; on the phone it seems like they’re invisible. When I tap randomly on the screen where they should be, they work, but I can’t see them!
Hi, I downloaded your project, and when I exported it for Android I can’t see the button on the smartphone; but when I download your APK, it works. What’s the problem?
I have retested it just now (exported the APK). It actually works great.
Hi! Thanks in advance for your work! I tested it and it worked very well.
I’m looking at the code and I have some stupid questions to ask you. I would be glad to get your feedback (sorry in advance for the dumb questions, but I’m not an expert in Arduino).
0.1 – the raw output of the accelerometer is what? Voltages? (from readAccel)
0.2 – the raw output of the MM (magnetometer) is what?
0.3 – the raw output of the gyro is what?
1 – line47: reading gyro, accelerometer and magnetometer, you have a FOR loop of 201. May I ask you why?
2 – line88: why 255? Is there a pre-set offset of 255 deg?
3 – line89: why are you dividing the raw data by 256?
4 – lines92–95: I don’t understand what you are doing here.
5 – lines113–115: why is it divided by 14.375?
I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn’t improve the attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling the so-called “smooth” representation.
Here (http://www.himix.lt/?p=915) are you using just quaternions? No KF? Right?
Have you ever tried to implement it on an Arduino Uno?
I had heard rumors that it is impossible due to limited memory?
Thanks in advance for your kind answers,
I really appreciate your wonderful work! It works nicely!
P.S. Do you have an oscilloscope for dumb macOS users?
1. Concerning all the first questions – look up some theory on the internet about how it works and read the sensor datasheets; that will answer a lot of your questions.
2. “I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn’t improve the attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would drastically improve the representation of the attitude over time, enabling the so-called “smooth” representation.” – show me some proof of that “drastic” improvement. Quaternions already give a smooth representation.
3. “Here (http://www.himix.lt/?p=915) are you using just quaternions? No KF? Right?” – correct.
4. “Have you ever tried to implement it on an Arduino Uno?” – Yes, I have tried it.
5. “I had heard rumors that it is impossible due to limited memory?” – Wrong, there’s enough memory.
6. No, I don’t have an oscilloscope for Mac.
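Purely as a hedged illustration of the arithmetic asked about above (questions 3 and 5), and assuming the divisors come from the usual GY-85 datasheet scale factors – roughly 256 LSB per g for the ADXL345 in full-resolution mode and 14.375 LSB per °/s for the ITG-3205 gyro – the conversions would look like this (shown in C# for brevity, not the actual Arduino sketch):

static class ImuScaling
{
    // Assumed datasheet scale factors for the GY-85's sensors.
    const double AccelLsbPerG = 256.0;    // ADXL345, full-resolution mode (~3.9 mg/LSB)
    const double GyroLsbPerDps = 14.375;  // ITG-3205, LSB per degree/second

    public static double RawAccelToG(int raw) { return raw / AccelLsbPerG; }
    public static double RawGyroToDps(int raw) { return raw / GyroLsbPerDps; }
}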
I did everything correctly, but when I rotate the object, it does not rotate on the Y axis; instead it makes a combined motion that continuously sends it downwards, making it impossible to orient. Why? How can I fix it?
I know this might be a stupid question, but I noticed when opening your project (downloaded from this site) that Unity immediately opens the “Game” tab and the “Scene” tab is missing. I was wondering how you did that. Thank you so much for this tutorial; I really learned a lot from this experience.
Thanks, I’ve managed to find a solution on those forums! But I’m now facing another issue, as I have multiple targets. It works great until I click the camera button and track another target: the share button from the previous track still appears… Is there a way to restart/disable your script in OnTrackingLost?
I ended up duplicating your script and calling the matching canvas for each ImageTarget. I don’t know if it’s the best thing to do, but it is working! Sorry for the bother, and thanks again for your tutorials!
After testing on several devices, I am facing a few problems on a tablet running Android 4.4.2:
– If I take a screenshot of an ImageTarget and share it right away, my app restarts.
– If I take a screenshot of ImageTarget A without sharing it and take another one of ImageTarget B right after, my app closes.
It works great on smartphones running Android 4.2.2 and 5.1.1, though; any idea what the problem could be?
Hi,
I downloaded the script file and loaded it directly into my scene. It didn’t work: both the buttons and the models appear as soon as I enter play mode.
Later I also tried changing the names of the buttons and models to match what I have named them in my scene. It now shows me an error telling me to fix the compiler errors.
Could you please explain which attributes have to be changed before loading the script?
Hi, your tutorials are very helpful, great job! Got one question: is it possible to take a snapshot including the interface graphic elements? In my case the snapshot works, but without the augmented layer. I’m working on a simple app with OpenCV for Unity. Maybe I have to change the camera name in your script?
Thank you, and please keep your tutorials coming!
Hi,
I ran this code with the GY-85 (BMP085) sensor board. The ADXL345 library is shown in black instead of orange in the program. Also, in the serial monitor the values do not change; they are always “0.00, 0.00, 92.50”. I don’t understand why. I need help 🙁
I actually know what your problem is. In my code, ADXL345.h is also black, but it runs just fine. I get the same problem if I move the libraries to the wrong location.
So to explain: when my code works correctly, I have my main folder labeled “Arduino”. Within that, I have a unique folder for each “.ino” file, labeled the same way as the “.ino” file (minus the .ino), and a folder labeled “libraries”. All of the libraries go in the “libraries” folder, each saved in another folder titled with the name of that library followed by “_library”. For example, it goes:
Arduino > libraries > ADXL345_library > “all contents of that library”
I get the problem where my serial monitor values are always “0.00, 0.00, 92.48” if I move the library files from the “ADXL345_library” folder into the “libraries” folder itself.
I don’t know if that actually makes a difference, but it was the same problem for me, so hopefully this helps you fix it!
Your tutorials are great!
I learned so much from them. I watched almost all your AR tutorials and executed all the projects!
I had a lot of fun watching and learning.
Thanks a lot and keep up the good work.
Please post more tutorials 🙂
It’s an awesome tutorial, but I was wondering if it is possible to add additional text to the user’s text – for example, he/she wrote “I like this game”, and at the end a fixed #CompanyName is appended?
A video with a new marker cannot be played; the video with the marker provided can be played. Do I need to upgrade to Unity Pro to play the video? Thanks in advance. 🙂
With NyAR4psg you will be able to track only square black markers; of course, you can make your own, but nothing like images. Such marker-based tracking won’t be as robust.
I did everything step by step but my videos won’t play.
The video shows up, but it gives me the X image, and when I click on it, it gives me the loading image forever.
How do I get my video to actually play?
Hey, I tried doing the same… but Unity crashed when I was trying to add the ImageTarget. I am using Unity 5.3.3f Personal. Can you tell me the version of Unity you are using, so I can follow your videos?
Hi Admin,
I tried Processing on Ubuntu, but I don’t know how to import the NyAR4psg library. I tried to create the folder and copy it into ~/Documents/Processing/libraries, but it doesn’t work correctly.
Thank you for the tutorial! I am working on an app that is going to use Text Reco and Cloud Reco. I have a couple of questions that I hope you could answer. For starters, when I run it in Unity, the area that can actually read the text is really small and not very forgiving whenever I move the text. I was wondering if you knew of a way to make the region where it reads the text larger / more forgiving when the target or phone moves? Also, I was wondering if you knew anything about cloud recognition: I tried using the Vuforia tutorials, but they are out of date and no longer work, and I can’t seem to figure out the newest tutorial either. I’m assuming I’m messing up in some way, because when I look online nobody else seems to struggle. Any input would help, especially with cloud recognition, if you can. Thanks!
1. The virtual buttons are not working unless I focus on the button.
2. Without touching a button, it changes the model based on my camera movements.
3. I changed max simultaneous tracked images from 1 to 4 (each in a separate build on my Android mobile).
4. The virtual button sensitivity setting was also changed from HIGH to LOW (each in a separate build on my Android mobile).
If you want, I will also send a link to my Unity package file.
What steps need to be performed if we want video playback on a cylinder target? I want to see a video on a cylinder-like object instead of a flat image marker.
I have already managed to display a video on an image target.
In the image target case, we upload our marker image to the developer portal database; but in this case, assume the image marker is a sticker attached to a bottle. I want to see the video as I scan the sticker.
So, should I upload that image target as a cylinder target image in the developer portal database?
And what would the hierarchy be inside the Unity project?
In the case of video playback on a target image:
– ImageTargetStones (parent) contains ImageTargetBehaviour.cs
– Video (child of ImageTargetStones) contains VideoPlaybackBehaviour.cs
What would the hierarchy be for displaying video on a cylinder?
Hi !
I did everything like you did, with Unity 32-bit, but when I click start and show the target in front of my webcam, the 3D model doesn’t appear in AR…
hello,
thank you for this tutorial.
I tried to follow this video, but I have a problem.
Assets/script/SnapshotShare.cs(7,17): error CS0246: The type or namespace name `AndroidUltimatePluginController’ could not be found. Are you missing a using directive or an assembly reference?
This error appears – do you know why?
Hi, I just bought the plug-in, but somehow I faced the same problem as sh. (I am new to this.)
The same problem here.
Hey.
Nice tutorials, they really helped.
One doubt though: what is the basic difference between markerless and marker-based AR? I tried searching for it, but I’m still confused. In this case, if we are adding the image beforehand, then how is this markerless AR?
It would really help if you cleared up my doubt.
In marker-based tracking we track only black square markers. In markerless solutions we can track image targets, faces, hands, fingers, finger-like objects, bodies, etc.
Great… I need this tutorial, thanks a lot.
Can you make another tutorial on rotating the car with left and right buttons?
I already made the UDT, but I can’t rotate the object.
Thanks :)))
Hi, I still don’t know how to install the distributed NyAR4psg library into the program; I have googled a lot, but I couldn’t find anything. Could you show me how? Thanks.
Hi Kiran,
Did you find out how to play the video continuously when the target is moved out of the camera’s focus? Ideally, we would like to use Vuforia only to trigger the video player, so it comes out of the image and, turning/moving towards the screen, finally settles into place. Once it’s in place, we can touch the play button for the video to play in full-screen mode. It would also be nice to close a finished video and return to targeting mode, to trigger another video from a different image. Any help would be greatly appreciated. TIA.
Only the video preview in full-screen mode would not depend on the tracking state.
As for the other needs – there is no easy description of how to do it; you just need to code it. But I don’t think you’ll be able to have additional buttons (of your own) while the video is in full-screen mode.
Thanks a lot for the tutorial. I’m having the same webcam problem, where it seems like you need the 32-bit version; in the latest Unity version there is no 32-bit version – what could I do?
Hi… wonderful tutorial. However, I followed all of the steps you explained, but the camera cannot detect the 3D object, even on the textured sheet. Can you please help me with that?
Hey, I have the same issue. I followed the guide and installed the app on a Nexus 7, but the object cannot be detected. I did not make any modifications to the code, so I don’t get why it is not working.
Hi, I followed your tutorial, and I couldn’t find AppManger.cs and SceneViewManager.cs in /Assets in Unity.
Could you tell me how and where I can find them?
I hope this project will be possible on a smartphone and that you create all the Yu-Gi-Oh monsters, because the app (androdisc) has only 60 monsters. I would also like to know how I can build this demo. Thank you.
So basically you mean I just have to put my video in the appcontent folder instead of your augmented_reality_technology one?
By doing this, will my video play instead of the video provided by you?
I tried it, but when I play it on mobile, the moment I tap on the screen to play, the screen goes black… Is there a specific mistake that I am making? Can you please tell me?
I have tried your tutorial, but when I move the object away from the camera, the interface stays on the screen, but in an angled position. How do you fix this? Does it have something to do with the script? Please let me know, thanks.
Same here! I would love to know how to fix it, so it only pops up when you point towards the tracked image. I already tried putting the Canvas inside the Image Target, and it does not work. Thanks for the tutorials!
Thank you for your tutorial, it’s very helpful.
I followed your tutorial, but I have an error like this:
Error attempting to SetHeadsetPresent
UnityEngine.Debug:LogError(Object)
Vuforia.VuforiaAbstractBehaviour:SetHeadsetPresent(String)
Vuforia.VuforiaAbstractBehaviour:Start()
Hello, I downloaded the source code and tried running it on my Android device. However, only the 2D ground image is displayed when targeting the image target. Any clue why that might be happening? Please help as soon as possible. Thanks.
Hi Edgaras Art,
First of all, thanks for the tutorial series. I have a question: in this case, I think that although we are moving the tracking image with our hands, it remains stationary in the Unity scene, and it seems that it is the AR Camera that is moving.
What I want is that, as I move the tracking image with my hands, the 3D object placed on the image moves along with it in 3D space.
Hey man, I’ve tried this tutorial and it works, but now I have a problem: the warning says “trackable userdefine lost” and the object doesn’t show up when I click the button.
Can you tell me how to fix this?
Hello Edgaras,
thank you very much for your tutorials.
I tried this with my own video and it works perfectly.
I also changed the orientation of the video by selecting VIDEO in the Hierarchy and changing the X scale value from 0.1 to -0.1.
I have a problem when I pause the video and play it again: the music starts from the beginning, but the video remains blocked.
Where is the problem? Maybe because I stream an MP4 video instead of M4V?
Hello
I am a fan of your page
In Tutorial No. 39 you put some JPG images as examples.
How do I place OTHER JPG images in Unity?
I tried to put some, but it did not accept them.
When I import the videoplayback package, I get these errors:
Assets/Common/MenuOptions.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/AsyncSceneLoader.cs(7,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/LoadingScreen.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
I’m building the Unity 3D project for my Samsung, and the panel and buttons are not appearing. They appear when I test it in Unity 3D, but not on my phone.
Do you have any idea what it could be?
You’re great; your content is worthy of a master’s class. This game is really good and has a lot of potential in many ways, but I think you’re missing digital marketing skills. If you need help with that, I know a little, hehe. I hope you keep making this kind of content, and I hope your projects are a success.
If the 3D model is rigged, then you can do that even with a static model (without animation): the head moves together with the head bone. For rotating the whole model (the parent GameObject) left/right, just use one of the rotation methods (RotateAround, eulerAngles). 🙂
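A hedged sketch of the “rotate the parent GameObject” idea mentioned above – it assumes the script sits on the model’s parent object, and the arrow keys and rotation speed are arbitrary example choices:

using UnityEngine;

public class ParentRotator : MonoBehaviour
{
    public float degreesPerSecond = 90f;

    void Update()
    {
        // Rotate left/right around the object's own Y (up) axis.
        if (Input.GetKey(KeyCode.LeftArrow))
            transform.Rotate(Vector3.up, -degreesPerSecond * Time.deltaTime);
        if (Input.GetKey(KeyCode.RightArrow))
            transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);

        // Equivalent alternative from the reply above:
        // transform.RotateAround(transform.position, Vector3.up, angleInDegrees);
    }
}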
Please, I’m a student starting my graduation project and I need help. I want to know how to start with an augmented reality app for Android devices. I want to use Android Studio, step by step; my idea involves face tracking too.
Can you please help me?
I have the same problem as:
“I did everything step by step but my videos wont play. It shows up but it gives me the x image and when i click on it it gives me the loading image forever.” Also, I can’t find the file “AppManger.cs”. Any ideas? I use the latest Unity and Vuforia plugins.
I noticed there were questions about the videos being inverted in tests. Mine is doing the same. I have tried everything suggested. Can anyone help regarding where the proper axis change is made?
The current setting for the ImageTarget is: X -0.1, Y 0.1, Z 0.1
hi, thanks for your tutorials. These tutorials are great help for beginners.
I’m facing a small problem, please guide me through it: when I press the arrow keys, the player animates perfectly and rotates too, but doesn’t physically move on the plane; it animates only at a fixed point. Thanks in advance 🙂
Your website is awesome. I discovered it several months ago, but always thought that this requirement of having a target image is somewhat cumbersome. Thank you very much, sir!
I’ve tried to make video playback like this, but Unity says that “IsampleAppUIEventHandler” cannot be found. That’s because I don’t have that file in my project. So where can I get that file?
First, congratulations on your tutorials – top marks. I’ve been following along, but now my Unity (32- and 64-bit) generates an error when starting the camera; I’ve already put the API key in the editor and it still generates an error with the name.
I followed your tutorial, it’s excellent, but I am not able to control the animation. In Game view it’s very large. Can you please explain how to control the animation?
Thanks a lot for such useful and detailed instructions! I’m just starting to explore how to create AR with Vuforia and Unity, and these tutorials definitely come in handy 🙂
I tried to follow this tutorial, but unfortunately there is no such property for the button (like in your video at 7:11). Here’s a screenshot of what I see: http://prntscr.com/c5cqi4 . There is no init() function. I tried to use start() instead, but it didn’t generate that second script where you change some code (from a private function to public).
I’m using Unity 5.4.0 and Vuforia 6 (tested on v5 as well).
Can you please explain what I’m doing wrong and how to fix it? Thank you so much in advance! I hope you’ll find time to answer.
Keep up doing awesome things! 😉
Failed to load ‘Assets/KudanAR/Plugins/x86_64/KudanPlugin.dll’ with error ‘操作成功完成。
‘, GetDllDirectory returned ”. If GetDllDirectory returned non empty path, check that you’re using SetDirectoryDll correctly.
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:203)
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:196)
Kudan.AR.KudanTracker:Start() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:220)
Hi, may I know how the characters look at each other? Are you using LookAt in Unity, or something else? Because I want my characters to look at each other, but I still haven’t found out how.
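A hedged one-script sketch of the LookAt approach mentioned in the question, assuming each character gets a reference to the other character’s Transform in the Inspector:

using UnityEngine;

public class FaceEachOther : MonoBehaviour
{
    public Transform other; // assign the other character's transform in the Inspector

    void Update()
    {
        // Look at the other character's position at this object's own height,
        // so the character turns without tilting up or down.
        Vector3 target = new Vector3(other.position.x, transform.position.y, other.position.z);
        transform.LookAt(target);
    }
}

Put the script on both characters (each pointing at the other) and they will keep facing each other.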
Hi there, I am playing around with this and want to have 5 pages instead of 3. For some reason, when I add two more pages, the swipeimage script seems to malfunction, not allowing me to swipe at all. Any thoughts? I adjusted all the parameters I could think of to account for the new pages, but I didn’t touch the script at all. Would it need modification? It didn’t seem like it should…
Hey, thanks for the tutorial… but the share button does nothing, while all the other buttons work. I bought the plugin and followed the tutorial; is there a permission I should be adding, or has something changed?
I have followed the instructions as above. However, the plane does not automatically disappear unless I click it. After I click to make the plane disappear, the cube or sphere does not appear. Please give some advice.
I am using Unity 5.2 and Vuforia SDK 5.5.9.
Hi Edgaras, I’m very interested in your AR technique shown in the video; if possible, could you make a tutorial or share some information that I can look up about this?
Also, in the Unity area, could you tell me how you convert a 2D coloring texture to map onto the 3D model? Please!
Thank you!
When I scan a plane using the camera and the loading bar, the model gets loaded on another plane. Is there anything I might have missed or messed up? I followed your tutorial closely!
I follow your tutorials and they are great.
I have a problem.
I am using Unity and Vuforia (user-defined targets).
I am recognizing objects as targets (I followed this tutorial), but my virtual 3D object and canvas are unstable, and the extended tracking doesn’t work like it does with image targets.
Do you have any experience with this – has this happened to you sometime?
I will explain: I have a sculpture to recognize, and I tried the AR-media object-scanning solution, but the app becomes slow and also unstable; that is why I am using user-defined targets to overcome my problems with object recognition.
#region PUBLIC_MEMBER_VARIABLES
public string TitleForAboutPage = “About”;
public ISampleAppUIEventHandler m_UIEventHandler; (The type or namespace name ‘ISampleAppUIEventHandler’ could not be found)
#endregion PUBLIC_MEMBER_VARIABLES
#region PROTECTED_MEMBER_VARIABLES
public static ViewType mActiveViewType;
public enum ViewType { SPLASHVIEW, ABOUTVIEW, UIVIEW, ARCAMERAVIEW };
I am running NyAR4psg 3.0.5 / NyARToolkit 5.0.9 in Processing 2.2.1 with a Microsoft LifeCam HD-5000 on Windows 7. When I run simpleLite, the background (camera) image appears only in the upper right corner of the window; it shows the lower left of the camera view. Apart from the incorrect background image, the tracking appears to be correct. I looked in the reference material and found public void drawBackground(processing.core.PImage i_img).
This function draws the PImage to the background. The PImage draws in part of the farclip surface +1.
This function is equivalent to the following code.
:
PMatrix3D om = new PMatrix3D(((PGraphicsOpenGL)g).projection);
setBackgroundOrtho(img.width, img.height);
pushMatrix();
resetMatrix();
translate(0, 0, -(far * 0.99f));
image(img, -width / 2, -height / 2);
popMatrix();
setPerspective(om);
:
My approach was to substitute this code for the line “nya.drawBackground(cam);” and then adjust the translate call to correct the issue. But I get a “syntax error, maybe a missing semicolon?” error. I added a semicolon to the end of the second line, setBackgroundOrtho(img.width, img.height);, and it still stops on the first line with the same error.
Any help would be appreciated.
Hi,
Please help me.
I downloaded the Augmented Reality Vespa User Interface – Mimic No. 1. Really, this is only the interface, so I can’t test the project.
Where can I download the motor image?
Hi, I need to know if I have to buy a 3D-sensor camera to build a game with Smart Terrain, or can I use the normal camera of my smartphone? Thank you.
Thank you for providing this nice platform. We are looking for a really good developer who can develop this paint functionality for us. We are already working on our product and need to integrate that part into it (we are using Unity3D, Vuforia, C#).
The basic requirement: the app should recognize/read the colors from the marker and apply them to the model itself.
Looking forward to hearing from you soon.
Regards
ABID
P.S. I’ll be submitting a few cool AR demos to this site very soon 🙂
I’m having an issue with the screenshot aspect ratio. When I take a screenshot (in landscape or portrait mode), the image comes out vertically stretched (or horizontally squished). I tested it on 3 Android devices – same on all 3. The images come out normal when I take a screenshot in Unity on my Mac.
After a lot of research, I still can’t figure out the cause.
Do you have any suggestions?
Thank you so much for the tutorial! I really appreciate it. 🙂
But anyway, do you have any idea how to reset the distance value once it hits “OnTrackingLost”?
Because every time, I need to separate them first (while scanning the object), and then the particle effects get destroyed.
Otherwise, they will still remain on top of the Image Target when I scan it a second time, even if I didn’t connect the paper.
I would greatly appreciate it if anyone could help with this problem. Thank you! :)
When I scan only one part of the image alone the second time, the particles still stick to the image even though I didn’t pair it up with the other image target.
I’ve tried a few solutions, but it seems too many errors come out.
For one of them, I tried putting parts of this inside the OnTrackingLost() section,
”
string NameTarget = “imageTarget_” + mTrackableBehaviour.TrackableName;
GameObject target = GameObject.Find (NameTarget);
transform.position=new Vector3(0,0,0);
”
in order to reposition the sphere back to its normal position when tracking is lost, but it seems like it’s not working, because I’m not very good at C# coding.
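A hedged sketch of the reset idea described in the comment above – not the tutorial’s code; it assumes the script sits on the particle/sphere object itself and that your trackable event handler calls ResetPosition() from its tracking-lost path, using a cached local position instead of the hard-coded Vector3(0,0,0):

using UnityEngine;

public class ResetOnTrackingLost : MonoBehaviour
{
    private Vector3 startLocalPosition;

    void Awake()
    {
        // Remember where the object starts relative to its parent (the ImageTarget).
        startLocalPosition = transform.localPosition;
    }

    public void ResetPosition()
    {
        // localPosition keeps the object placed relative to the ImageTarget,
        // instead of snapping it to the world origin as Vector3(0,0,0) would.
        transform.localPosition = startLocalPosition;
    }
}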
Hey guys, I have used your tutorial to make the simple video playback app and it’s working great. I just want to know how we can change the size of the video that appears after tracking the image target?
Please, I need help.
Hello, I have followed the whole tutorial properly, but when I connect my laptop to the Kinect, the picture won’t open. I don’t know what happened; do you know how to solve this problem?
CS0246 C# The type or namespace name “AndroidUltimatePluginController” could not be found (are you missing a using directive or an assembly reference?)
I tried to run your project in Unity 3D – it’s amazing! Thank you. But when I built it into an application, it couldn’t detect my webcam. Do you know why? Please give me an answer, thank you.
context.enableUser(); – when playing this sketch, at this line the following error shows: “The function "context.enableUser();" expects parameters like "context.enableUser(int);"”
Please help me resolve this.
I have never succeeded in creating AR files using Vuforia and Unity… I use a desktop PC that doesn’t have any camera. Can I do it with this specification: desktop PC, Win 10, 16 GB RAM?
Assets/VirtualButtonEventHandler.cs(5,14): error CS0101: The namespace `global::' already contains a definition for `VirtualButtonEventHandler'. What about this error?
Hi sir,
your tutorials are great, thanks for uploading them.
Can we integrate 2 or more videos with a single image target and add next and previous buttons to switch between the videos?
Is it possible?
Thanks in advance.
Hi there,
Thank you for the video.
I am new to using Unity and all this stuff.
I followed each step,
but I got an error after I removed the Utility folder at minute 10:34.
This is the error:
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(19,13): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the
Does this project’s code include a Kalman filter?
This one – no, and the sensor information is not fused in any way. Actually, I would suggest using a quaternion implementation rather than a Kalman filter, like here: http://www.himix.lt/?p=915 (sensor fusion is done there), when you use the array of 3 sensors (accelerometer, gyroscope and magnetometer). The use of a Kalman filter would not provide noticeable improvements over quaternions (I’ve done lots of experimentation).
BUT! If you use only one sensor, for instance an accelerometer, I would recommend using a Kalman filter.
I’m planning to make a KF tutorial in the near future.
Does this work for Kinect ONE?
Unfortunately, no.
RF\SPI.cpp.o: In function `SPIClass::begin()':
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:24: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:25: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:26: undefined reference to `SPIClass::pinMode(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:28: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:29: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
C:\Users\Mathi\Documents\Arduino\libraries\RF/SPI.cpp:30: undefined reference to `SPIClass::digitalWrite(unsigned char, unsigned char)'
Please provide more info/details on this error. What did you do – did you copy the library folder into the proper folder, etc.?
Please… make a PDF tutorial.
Sorry, but I am not planning to. I’m thinking of writing an e-book, but it won’t be for free.
Yes!! An ebook would be amazing !
Can you please provide me with the details and packages of it… rohit.gupta2267@gmail.com
Sorry, but I do not sell the book. You can try to search for it on eBay; there are plenty of other similar books related to augmented reality technology.
I don't want the book. I want the packages that you used, like the games and all… If you can provide them it will be really helpful.
This book goes together with DVD disc in which you find the Augmented Reality software. It might be games, it might be some other exciting things related to Augmented Reality. So this is only DEMO of the book with games that someone else developed.
I want to make final year btech project on agumnet reality so i need such things. You have also mentioned about the ultimate project, what is it ? Can you help me in building a project with me ?
I can tell you that Ultimate project won’t be available for free, it is something that I’ve been working and improving for several years, it’s a combination of Augmented Reality and Arduino.
Tutorials – is the way that I help. You can’t find anything that would fit your idea? What is your idea?
My idea is i want image recogination and text recogination connected with internet. Like if i want to know about a book , i just point my camera on the book cover and it will tell me the reviews. Its basically a part of Sisth sense technology developed by pranav mistry.
Image and text recognition is basically solved in my tutorials, but it is all predifined (images, text). If it is all predifined it would be easier. What worries – the search on the internet. But it might be that you want a little bit different application, for example, take any books cover picture, recognize it properly and make a search on the internet?
Yes, exactly. Like by looking at the weather, it tells you about the weather.
Well, in my opinion, this is a hell of a lot of work and I'm not sure how to make it possible.
The video just shows code to read data from the photoresistor – what about the image processing and the other stuff?
Watch closely video, there are two parts, one for arduino and photoresistor, the other one for Processing and Augmented Reality while acquiring photo-resistor information.
What library does the Processing software need to run this code?
Nyar4psg, please watch Tutorial No. 1: http://www.himix.lt/?p=512
Hi! I’m new to AR and Unity but I have been a software developer for over 10 years. Thank you in advanced, I will try your tutorial. It looks like a lot of fun! 🙂
Thank you so much for the tutorial. It was really hard to find a working tutorial for the virtual button!
I’m glad I could help!
Hi and thank you! It works 🙂
https://www.youtube.com/watch?v=oHVXVJKUM6Y
Is it possible to control the mouse using (a-star 32u4 micro – a tiny arduino leonardo clone)
and MPU-9150 ?
http://fr.hobbytronics.co.uk/image/cache/data/pololu/a-star-32u4-micro-2-500×500.jpg
https://cdn.sparkfun.com//assets/parts/7/3/7/6/11486-04.jpg
I don’t know if it is a clone as you say, it might just work out for you. Just try it out and let us know.
It’s very straightforward to find out any topic on net as compared to books, as I found this post at this website.
Man, thanks a lot for these videos. I'm just a noob in AR and your videos are helping me a lot.
I’m glad!
Hello, very good job. I would like to know when you will upload the project; I am waiting. Greetings and thanks.
Hi, I’m working on it…
Brother, can you help me in making a virtual dressing room? So we can try on any dress which is available on any shopping website.
Hi, I haven’t started this tutorial yet. But maybe you have some clothes I could use in the future for this tutorial? I won’t need to search it by myself.
Hello Admin, is it possible to make a virtual room to try on accessories 🙂 ?
If you ask generally, then of course it is possible. If you ask me will I do it? I will, but I can’t tell you whether it will be available as tutorial or DEMO. In the future we’ll see. 🙂
Will you do it for me if I pay you?
sure
hello sir, i m a student i want to develop my own project which is interior designing using marker less augmented reality how can i do that can you give me a demo on that topic which i can use as a reference for my project…. please give me a demo on interior designing using augmented reality…
actually i m very confused so it would be really great if you help me out so please let me know how it work through your demo for interior designing augmented reality… i have prepared the marker based interior designing by taking reference of your demos but now i want to do it in markerless sooo please give me one demo on it please
This is where I suggest you start: https://www.youtube.com/watch?v=qfxqfdtxyVA
This is a markerless AR. Just start from adding your interior design content. No need for a seperate tutorial on this.
sketch_aug14b:32: error: ‘ADXL345’ does not name a type
sketch_aug14b.ino: In function ‘void setup()’:
sketch_aug14b:40: error: ‘adxl’ was not declared in this scope
sketch_aug14b.ino: In function ‘void loop()’:
sketch_aug14b:46: error: ‘adxl’ was not declared in this scope
Please, can anyone say how to solve this error…?
I just downloaded the library and pasted it… still I'm getting this error… what do I have to do…
Reply as soon as possible.
Are you sure you copied the library into the right directory? Could you paste me the path to this library?
After you copied the library, did you restart the Arduino IDE itself?
Suppose I want to take a runtime image directly into my application and the user can place it wherever he wants – how can I do that? Which platform will be suitable for my project? Suppose I want to take an image from an online shopping site as input to my application, and as output the user will see how that interior would look.
Just to make it clear, you're talking about the app using META glasses, right? By saying "runtime image directly" you mean taking a snapshot from the camera with augmented content, and placing this taken picture in any place you want in augmented reality?
Hello, very good job. I have this error:
Error DllNotFoundException: MetaVisionDLL
Meta.CanvasTracker.Release()
Meta.CanvasTracker.OnDestroy()
Would you know how to fix it? Thank you.
Sorry, haven’t stumbled upon this problem.
You used 64-bit Unity! Use the 32-bit version.
I would also like to try it on Android mobile augmented reality with Vuforia. Please help.
So what’s the problem? I would start from here: http://www.himix.lt/augmented-reality/augmented-reality-android-app-export/
Are METAPRO glasses required to run the scenes?
Does it not work with a webcam?
Correct
Hello, I was wondering if I could use this library on a 2-axis accelerometer. I will download the library now and see if you utilize function overloading so that I can pass in only values for the x and y axis; but if not, do you have any ideas? I have a 2-axis accelerometer that is hooked up for I2C ONLY. Please let me know if you have any suggestions or advice. Thank you. -Joe
ADXL345 is a 3-axis accelerometer and using the library provided, you should not have problems in acquiring that information. I2C works perfectly for that.
Is there a tutorial for scanning a single image?
What do you mean by "scanning a single image"? Recognize and track? If that is the case, the basics start from here: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/
An amazing job, your tutorials are excellent. I would like to know when the project will be available. Thank you so much.
Soon, but it will be only as a DEMO, not something that I will share. At least for now.
Instead of creating confusion with words, this is what I want: to create an Android application for interior design with augmented reality…
https://www.youtube.com/watch?v=ipkz6y9mfvk
…I want to create an Android application in which the user can buy furniture from online shopping sites, and by using my app the user will be able to see an augmented view of that furniture at their place, and if they like it they can purchase it… I hope this time it's clear what I want to say…
Yep, it is crystal clear. You will have a hell of a lot of work to do.
Hmm, yes, I know it's not so simple and easy… but I want to do it – not for a particular output but for the sake of knowledge. Please, can you guide me on this? I will do my best… just guide me, I will work hard on it… please.
My guidance is the tutorials I make. Just be patient and I think fragment by fragment you will be able to build what you want.
Okay, thanks… but I have to submit it as my college project, so let me know where I should start. Which platform will be suitable for my project and what should I do first? Just let me know, and I will be able to start my work…
Start here: http://socialcompare.com/en/comparison/augmented-reality-sdks
Lots of SDKs for Augmented Reality app development. Try to find what fits your needs best, if you think Unity3D + Vuforia is just not enough.
woooow so great thnx admin
You’re welcome
Can you give me a way to do this: after tracking the money paper and getting a lot of papers, when I zoom in on the virtual papers I want to replace them with another image.
…
thank u
I can't run it, it says "The field PConstants.OPENGL is deprecated".
Could you please help me make it work?
Hi. It's really hard to tell what the problem is if you did everything according to the instructions shown – did you try googling the error?
P.S. I hope you used the older version drivers provided on this website, not the newer ones?
I made it work, it runs perfectly now 🙂
The problem was that I installed Processing 3 instead of Processing 2.
That’s great!
[…] Don’t forget to subscribe as more cool tutorials awaits you! More information on this tutorial: http://www.himix.lt/augmented-reality/augmented-reality-and-leap-motion/ […]
What version of the Vuforia SDK do you use?
Right now the newest – vuforia-unity-5-0-5.unitypackage (33.17 MB)
Very Good
Great tutorial… thanks. Please send complete info ASAP…
It’s already completed.
Hi, greetings from Rio de Janeiro, Brazil! First, I wanna thank you for all your tutorials, I am trying to learn more about Unity3D since I started to watch your videos. But I have this question: If I want to use my smartphone as a stereo glasses with augmented reality, does Vuforia generate an output app for this? Thanks once again and I hope your ideas help to transform our world into a better place!
Hello, Ricardo. Thank you for kind words.
Actually I don't know the answer, I haven't done anything like that so far. I mean I didn't try to use a smartphone as AR glasses. But if you find some useful info on the internet later on while researching, please let me know – I'm interested in everything related to AR.
Hi, I have one query: after taking a screenshot, the screenshot is not saved into the gallery. Is there any solution for this?
Yeah, it’s not in the gallery, but it’s somewhere in your device’s memory. If you know how to modify this code in order to send the pictures to the gallery, please, let us all know! So far I achieved this (saving pictures to gallery) only by using Unity3D assets/plugins, which comes with a price.
Eagerly waiting for new tutorials. Congratulations on the work done.
Hi. I have downloaded Processing 3. When I run simpleLite the following error is shown in the console: "No library found for processing.video
Libraries must be installed in a folder named 'libraries' inside the 'sketchbook' folder."
Any solution?
Hello, download the 2.2.1 version of Processing (https://processing.org/download/?processing) and you should not have this problem.
Waiting for the new tutorial 35 – very good job. Greetings.
Thanks, currently I’m waiting for free time 🙂
Hi,
I don't have any camera on my computer, although I would like to use the camera of my mobile phone – is it possible to make the program search for the IP address of a camera?
Cheers
Hello. Nothing is impossible; however, I never tried to do this. Trust me, you don't need this additional problem to solve.
Hi! Great work. Any chance for wikitude in October?
Hi, not yet.
I downloaded your project from this site.
Only one object is viewed at a time; I want to run it as in your video.
I run the app on an Android mobile – is any setting required to run the app in multi-object mode?
I would suggest starting by recognizing the Kuriboh card first, after that the Blue-Eyes White Dragon. Actually, the Kuriboh card does not have so many good tracking features. Before using the app on Android, did you test it in Unity3D play mode? I would suggest you do so and check how well multitracking is accomplished.
Hello Edgar, I like this sample. Actually, great work with all of the tutorials. Do you plan to make some tutorials for Google Glass as well? Thank you
Martin
Thanks! Actually I don't plan to make a tutorial on Google Glass as I don't have it. I have AR META glasses as I've seen more potential in them. Who knows what the future will bring 🙂
Hello, I've been trying to make a Cylinder Target based on your tutorial,
and while I'm trying to upload the SIDE image on the Vuforia Target Manager, it fails, saying the Euforia of Beauty Logo image doesn't match the dimensions. Can't you post a tutorial on how to measure the image so it can be uploaded to the Target Manager?
Hi, I’m not going to make a tutorial on this. I’m sure you will find your way out to upload the image in proper dimensions.
I've added a rescaled image (Download # Print Euforia of Beauty Logo to Augment the Content and Create the Tracker for Cylindrical Object (*.jpg file)), you can try to upload it once again.
you are so great!
Hello, I tried to make a rotation button on Android,
but it's always looping.
The only way to stop it is to hold the button,
but if I release the button, it loops again.
Most likely you did something differently than I showed in the tutorial.
Thanks a ton for all this knowledge sharing. I have become a serious follower of your tutorials and have also subscribed to your YouTube channel. These tutorials are great assets for people like me who are getting into the AR field.
Thanks for the kind words.
Hi, I am totally new to AR and software development (not a programmer at all). Thank you, I tested it out and it works. But I have a question: if I want to use my own 360 picture, how can I upload it and use it?
I'm not sure what you mean by "360 picture", but how to upload the image is shown in this video. Just change the logo to your own picture.
Hi, thank you for the tutorial, it is a really good kick start for someone who is totally new and wants to learn, like me. However, I have a question: after doing everything you did in the video, when pressing play, how do I like it to my android? Are there any videos you made which explain this?
Hello,
What do you mean by “how do I like it to my android”.
Sorry, I meant link* it to my android. typo.
You just need to export it for Android OS as shown in this tutorial: http://www.himix.lt/augmented-reality/augmented-reality-android-app-export/
Hi,
Thank you for the tutorial video, it is a really good kick start for someone totally new like me.
It would be great if you could help me with my question.
After doing everything you did in the video on a PC, and pressing the play button, how do I link the program to my Android device?
Is this marker-based or markerless augmented reality?
The 14th Tutorial is markerless.
This one is a marker-based example: http://www.himix.lt/augmented-reality/augmented-reality-marker-based/
I have been trying to run this code for hours, but for some reason the Arduino IDE ver 1.6.5 on Windows 10 cannot find HMC5883L.h. I've placed copies of the library in the main library folder, the sketchbook library folder and the hardware library folder, but it still can't find it. I also used the library manager. Help.
Hi, I have Windows 10 x64, I tested it right now and it works perfectly. Make sure that the library directories look something like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_Example
not like this:
C:\Users\EdgarasArt\Documents\Arduino\libraries\HMC5883L_library\HMC5883L_library\HMC5883L_Example
Depending on how you extract the libraries, it might end up with two identical folders one inside another (HMC5883L_library\HMC5883L_library\HMC5883L_Example), so I assume Arduino can't recognize it.
Is it possible to turn on a relay with a button on the HTML page?
Of course, the same way you would turn on an LED. Easy as that.
OK, but where can I find the code?
You can use this example:
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-automation-control/
Hello…
How can I make a button so that when I click on it, the model will be duplicated/added, and another button that will remove the model?
Thanks…
Hello, this is easily done, but I won’t start coding for you here. I suggest you google Unity3d C# code on that.
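(A rough sketch of the usual approach, in case it helps – this is not from the tutorials, and the names are made up. Hook AddCopy() and RemoveCopy() to two UI Buttons through their OnClick() lists.)
using UnityEngine;
using System.Collections.Generic;

public class ModelSpawner : MonoBehaviour
{
    public GameObject modelPrefab;   // the model to duplicate
    public Transform imageTarget;    // parent, so the copies follow the marker

    private List<GameObject> copies = new List<GameObject>();

    public void AddCopy()
    {
        GameObject copy = Instantiate(modelPrefab) as GameObject;
        copy.transform.SetParent(imageTarget, false);
        // shift each new copy a little so they don't overlap
        copy.transform.localPosition = new Vector3(0.2f * copies.Count, 0f, 0f);
        copies.Add(copy);
    }

    public void RemoveCopy()
    {
        if (copies.Count == 0) return;
        GameObject last = copies[copies.Count - 1];
        copies.RemoveAt(copies.Count - 1);
        Destroy(last);   // remove the most recently added copy
    }
}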
Hi, I don't know why, but the Arduino software doesn't recognize the ADXL345 library. The program shows it in black instead of orange.
The folder is in this directory: C:\Users\Leyre\Documents\Arduino\libraries\ADXL345_library with the other libraries (which it recognizes well), and the folder ADXL345_library is not duplicated.
Nevertheless, when compiling, the software doesn't show any error.
Any idea? I need help please.
First you said it does not recognize the library, but then you say that it "doesn't show any error". I don't understand. How do you know that it can't recognize the library if no errors are written?
Thank you for answering.
The program shows the library and all the functions related to the accelerometer in black instead of orange, whereas the rest are in orange. Isn't that weird?
Nevertheless, it compiles and I can upload the program to the Arduino UNO board (although the data acquired is really weird).
That's really weird, it's hard to suggest something right now, but answer this question: do you really use the GY-85 board? Yes or no?
This is important because I had some weird data readings while using a board with only an accelerometer alone, and I couldn't find the solution for that.
Maybe the colors of the text have no importance. The fact is that I'm pretty new to Arduino, I'm still learning basic things.
Another question: how can I get angles between -180 and 180 degrees? I get confusing data in the serial monitor.
Try to search for a specific GY-85 library on the internet.
Hi again,
I have been trying the code the entire day but I don't understand the outputs.
I want the angles in degrees. On the one hand, I get rolldeg and pitchdeg between 20 and -20 instead of 90 and -90 degrees.
On the other hand, when I print anglegx, anglegy and anglegz, I get signals which change even when the IMU is stationary.
I would like to add a complementary filter to your code, but with those output data I can't.
I have watched the video and it seems to me that the output data shown there is correct, but I don't get the same.
Could you help me please? I'm a little desperate because my project depends on a good measurement of the angles.
Best regards and sorry about the mess.
If you send me some pictures of the sensor wiring to the Arduino microcontroller and screenshots of the error in the program, maybe I will be able to help you.
Hi Edgar, thanks for these great tutorials,
they have been a great help.
I'm just wondering why you don't have audio in these tutorials, explaining what you do – it would provide much more help.
I have my reasons.
Can you please create a video tutorial on developing mixed reality using Vuforia and Unity3D?
Thanks in advance
Can you show me an example of what you really expect from a "mixed reality" tutorial?
Thanks for your reply. I really need a tutorial on how to use Vuforia and the Cardboard SDK.
The app would scan the image target to track the AR world, and the user would be given a button that, when looked at, will teleport the user into the VR world and vice versa – just like the Vuforia sample.
Thanks,
I would be expecting the tutorial soon.
Great idea, but I wouldn’t expect tutorial soon.
I am having the same problem; after checking it, the error points to this variable:
ADXL345 adxl; //variable adxl is an instance of the ADXL345 library
Arduino_AccelerometerADXL345_Servos:32: error: ‘ADXL345’ does not name a type
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void setup()’:
Arduino_AccelerometerADXL345_Servos:40: error: ‘adxl’ was not declared in this scope
Arduino_AccelerometerADXL345_Servos.ino: In function ‘void loop()’:
Arduino_AccelerometerADXL345_Servos:46: error: ‘adxl’ was not declared in this scope
Can you explain step by step how to resolve this error?
Check the path to the library. Wrong directory.
I could make it work.
thank you.
Hello sir,
I want that sensor – can you provide me a link to get the shock sensor module? The sensor is very important for my bachelor's project.
Thank you
You’ll find it on ebay with keywords “Shock-Knock Sensor KY-031”.
Hello! Student studying physical computing here. To run by “dropping in,” are there any constraints for dimensions/scale or file size for the OBJ model?
If your *.OBJ model size is huge, most likely the model itself is really complex and has lots of vertices/polygons. I don't know the exact constraints on this matter; it also depends on your computer specifications. You should sort this out by experimenting with the models you have (if they're really complex).
Hello, when I move the sensor fast, the servo motor locks and does not follow the movement, and makes a noise (tec tec tec); then it resets back to normal. Can you say what it could be?
I'm using a TowerPro MG946R servo motor powered directly from the Arduino's 5 V.
Are you sure that sensor readings are correct?
as igniting 2 LED with applause and which lights decicir first?
Please, repeat the question, can’t understand it.
Hello! Good day! I am a beginner and I need your help, please help me. I downloaded your code; I have the ADXL345 3-axis digital accelerometer and 2 servo motors. I made the right connections you've indicated above. But like other reviews say, there are some errors… What should I do, sir? And where should I start… I don't understand the library stuff you mentioned… What library is it? Where could I find it? And where do I copy it?
Hi, from 0 to 12 seconds in the video, I've shown where to put the library folder. Do so, and use the code. Good luck.
How do you make the augmented reality application compatible with Windows – what programs are used?
For starters, I didn't make the app for this book, this is only a DEMO of what other developers did. I haven't tried it yet, but one of the users provided a link (Vuforia standalone) in order to make Vuforia work on the PC platform. However, I haven't tried it yet and right now I can't find the link; it's somewhere in one of the tutorials' comment sections.
Hi, I already made those examples but I have a question: do you know about the DragonBoard 410c? I want to run my app on that board. I installed Ubuntu Linaro but my app doesn't work with it… Do you know, or have you exported and run some of your apps on some other board?
Sorry, but I don't know; I've heard about the DragonBoard 410c only from you right now 🙂
Hi,
I will buy the Meta Dev Kit 1 next time. I have a question about your work:
with which program do you make these videos?
In the videos the FOV is extremely big! Is this only on the video or on the Meta too? I have a BT-200 and this device has a small FOV at near distance.
Have you had contact with META about when Dev Kit 2 appears?
I use Techsmith Snagit and Camtasia.
I suggest comparing the specifications of the BT-200 and the META glasses. I didn't have the opportunity to use any other AR glasses (only META) so it's hard to compare, but those who put on the META glasses for the first time say that the FOV is not so big. So, of course, watching the video here and using it for real is like day and night.
Can you re-specify the last question?
Actually, META glasses use the Moverio see-through display as their base display. So the FOV will be exactly the same between both devices.
Hi! Great tutorial, but I am stuck on the building part… I am doing the same as you in the movie, but my APK is not working on Android… it has a black screen… I have never developed for Android and I believe I am missing something in the setup. My project works great in the Unity preview, the whole building process is OK too [no errors], but after copying it onto the device and installing, nothing happens. Can you point me in some direction? I need to run the project on a phone and all seems to be OK except this.
Thanks a lot!
It's hard to say, but if it's not a "top secret" project, send it to me and I'll take a look. We can start from the *.apk file. Maybe it's a smartphone problem. Who knows. It's also worth googling this problem.
Hi
Thank you so much… this is very useful to beginners like me.
I have one question.
I tried this with two different objects and tested it on an Android device. The issue is that both of my objects are visible from the start… The buttons didn't work…
Did you put the virtual buttons on a textured tracking object? If not, I suggest you do so. Don't put virtual buttons on a plane without any textures. I hope that helps.
I can't run the file after making it like in your video.
Is the webcam missing???
http://www.uppic.com/uploads/14461898941.png
http://www.uppic.com/uploads/14461898942.png
Replace the 64-bit Unity with 32-bit Unity.
Hello,
thanks for your answer.
I think Meta will bring out a new version of their glasses soon. I think the name is Dev Kit 2.
Do you have any information about this release?
Thanks
Oh, I don't know anything about it, I thought the next version would be for consumers.
Hi, I use Processing 2.2.1 and install nyar4psg 2.0.0 library. When I run simpleLite the following error is shown in the console like that(https://dl.dropboxusercontent.com/u/39808973/Screen%20Shot%202015-11-01%20at%207.52.58%20AM.png). Can you help me please?
If the library is in place, it’s really hard to say what else could be wrong here.
What is the folder path to your library?
Documents > Processing > libraries > nyar4psg
I can see that you're not using Windows, so it's hard for me to suggest something. Also, you're using Processing 2.1.2, not 2.2.1, but I don't think that this causes the problem.
Hi, a thousand thanks for your code! I tried several that didn't work, but this one works perfectly!
By the way, I will need to interface with two sensors. I don't know how to change the code to measure two – could you please help me?
Thank you so much in advance..Have a great day!
Sincerely,
Caryn
I'm glad it works. The code is not complicated at all; in order to add an additional sensor, try to sort it out by yourself.
May I know what I am measuring in the code? Is it vibration amplitude or vibration time?
Amplitude
Thanks. Can you tell me the amplitude's SI unit?
No, I can’t. This is raw data readings from the sensor.
Alright, thanks anyway.
Hi,
I'm an interactive developer using Unity3D. I sincerely need your help on how to integrate Vuforia and the Google Cardboard SDK in Unity 3D for architectural visualizations.
Great work with the tutorials.
Thanks in Advance
Hello, John. Haven’t done anything on that yet. But I will soon, you can wait for a tutorial (2 to 3 months), but if you want it to be quicker it won’t be for free.
Yeah!! The buttons are already on a textured area…
But I got an error at runtime: it couldn't find the path of ICSharpCode.SharpZipLib.dll to unzip the file. I couldn't understand it. I am not a programming guy…
Can you help me?
Is C# a language that supports Android?
Sorry, it is difficult for me to understand the script and do something with it…
C# is supported by Unity, and using Unity we can export an app for Android devices (and other OSes). I really haven't stumbled upon this problem – can you copy-paste the directory of your project files? Have you tried anything simpler up until now? For instance: http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ I suggest you start from there and then move on.
After doing everything as in the video in Unity 5.2.2 32-bit, the Unity-chan model isn't rendering in the video. What's wrong?
Do I have to print the tracker before augmenting it? It can't work with just any surface?
Correct.
Can you please share the code?
Actually I needed the code with which you had drawn the rectangle around the marker, and how to return the coordinates of the corners of the marker.
The code is built into the library itself, so just start the example code and look for the code fragment related to the marker corner coordinates.
Thank you very much, I just did not go through all the examples properly. Got it now.
How do I apply for a login account for META to get the tutorial?
Try here: https://www.getameta.com/ but I’m not sure whether you will be able to register if you haven’t bought a META glasses. Try it.
I think now it’s unavailable, they moved to META 2 quite a while, who knows maybe META 3 is on the way 🙂
Is there any way to add animated 3D models to this???
Well, actually there is, but it would be more or less a "workaround" that I would not suggest using. What do I mean? Basically, you would need to export your model animation for every frame. So most likely you would end up having lots of *.obj files, and you would need to load these models frame by frame in the void draw() part.
Option 2: search for a library that would be able to import an animation file (*.fbx file extension). I couldn't find a better solution back in those days, but maybe it is available now. Who knows… some research is needed.
Option 3: Try doing AR tutorial No 14 that involves Unity3D and Vuforia, you can add animations there quite easily.
Thanks for your advice. Actually, I went through all your videos and the Unity ones are quite cool. But I am a newbie with Unity and coding any logic into it is becoming difficult.
But I will try the FBX part you mentioned.
Hi, thanks for all the tutorials.
I made my AR APK with Unity + Vuforia; my mesh model is from my own SketchUp models (imported as FBX into Unity). When I try it on PC it works fine, but when I installed it on my smartphone it works with slow response and lag. Any suggestion regarding the AR APK size? (Mine is 187 KB.)
Next question… how do I make a quit button in the app? Thanks…
Hi, I don't think it has anything to do with the *.apk size; I would look into the complexity of your models, or maybe your smartphone has low specifications. Look here http://www.himix.lt/augmented-reality/augmented-reality-screenshot-and-sharing-on-facebook/ or search on Google for "quit button in unity".
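(The quit button usually comes down to a single call – a tiny sketch, not part of the tutorial project; hook QuitApp() to a UI Button's OnClick() event.)
using UnityEngine;

public class QuitButton : MonoBehaviour
{
    public void QuitApp()
    {
        Application.Quit();   // closes the app on a device; does nothing in the editor
    }
}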
Please, I did exactly as in the video but the camera didn't open and this appears.
Any help?!
http://www.uppic.com/uploads/14469334541.jpg
Use 32bit Unity.
Great!!!
Do you use Vuforia? Do you need a network? Tutorial? :-)))
Yes, I used Vuforia. What do you mean by "do you need a network?", "tutorial?"?
Hi,
how can I take another snapshot without replacing the older one?
I want to create a snapshot gallery stored on the SD card.
Thanks.
You will figure it out, I believe in you 🙂
Please tell me, I cannot figure it out…
One more thing:
I'm now developing an ARVR app – does this snapshot button just snap the on-screen interface?
Does this work on a stereoscopic interface? I want just a monoscopic (single) image saved.
*sorry for my English
Everything is shown in the video, isn't it?
Network = 2 devices and 2 users controlling one character. I am working on this but it is very heavy to develop. The synchronisation must be done over the network.
Tut? Will you make a tutorial on this, with a source download etc.?
I haven’t done anything on this yet and it’s not in my plan list to do it.
It would be great if we could create a game based on that… could you make a tutorial? … I have many ideas to put into practice!
Maybe in a far future.
JoWeb, can you do a tutorial on connecting via network?
It’s not in my plan list.
But can it run even though my OS is 64-bit? And if it doesn't, is there any other solution?!
32bit Unity can run on 64bit OS.
Is there any way to run this on 64-bit Unity?
I don't want to remove the current Unity and install the whole thing again…
I hope there is; if you find one – let us know.
Is it possible to give us any info or hints about the making of this demo, or can you please mention any reference we can use to learn scripting in Unity and achieve the same results?
This tutorial http://www.himix.lt/augmented-reality/augmented-reality-fusion-effects-using-multitarget-tracking/
is the closest to this demo, I would start figuring out how this works.
Is there a way to paint textures onto walls/buildings as a user moves his camera around a street, anywhere?
If there is a way, I don’t know how to achieve it.
Yes, I know, of course you are using a distance script… thanks… but the main part is how to modify the distance script parameters, or how to write a new script myself by using PlayMaker or something else.
What is the best way to learn to script in Unity?
Go to the Unity3D website and watch the tutorials.
Hey admin, do you have any example which runs in MATLAB?
Nope
I have found all of your tutorials very helpful. Great work.
Hello,
Is it possible to turn the buttons into virtual ones that can be pressed with your hand, like you demonstrated in another tutorial (No. 19)? I don't succeed when I try to merge the UI code from this tutorial with the Virtual (Vuforia) buttons one…
You already answered your own question.
I thought it was just because I'm a lame coder… Is there a way to extend interactions with virtual buttons (Vuforia-Unity)? For instance, jumping to the next scene or playing a video?
By the way, thanks for all your tutorials, they are very helpful.
Yes, there is a way. The same way I switch models in this tutorial (http://www.himix.lt/augmented-reality/augmented-reality-virtual-buttons/) you can add different functions – load another scene and so on. Just dive into the code of "VirtualButtonEventHandler.cs" (starting from case "btnLeft": and case "btnRight":).
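(For those merging this with scene loading: a hedged sketch of one way to do it, not the tutorial's VirtualButtonEventHandler.cs. The virtual button name "btnLoadScene" and the scene name "NextScene" are placeholders you would create yourself, and SceneManager assumes Unity 5.3 or newer.)
using UnityEngine;
using UnityEngine.SceneManagement;
using Vuforia;

public class SceneButtonHandler : MonoBehaviour, IVirtualButtonEventHandler
{
    void Start()
    {
        // register for every virtual button that is a child of this ImageTarget
        foreach (VirtualButtonBehaviour vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonAbstractBehaviour vb)
    {
        if (vb.VirtualButtonName == "btnLoadScene")
            SceneManager.LoadScene("NextScene");   // add the scene to Build Settings first
    }

    public void OnButtonReleased(VirtualButtonAbstractBehaviour vb) { }
}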
I will try until I make it, thank you very much!
There's one more issue no one could answer on Vuforia's forum: is it possible to trigger a whole environment in which the viewer can dive?
Should I use extended tracking (the triggered image would be much bigger than the marker) and keep the image target active even when tracking is lost (can I just disable this function to keep it on even if I turn the device/camera in another direction)?
I’ve read about different plug-ins like Unified Coordinate System that could help building augmented environments… Could you point a direction I should go?
Cheers!
It is possible, but I haven’t done this in Unity. Basically what you need I did it here with MARG sensor (http://www.himix.lt/arduino/arduino-and-virtual-room-using-mpu-9150-marg/), just with pictures and without tracking any image target. Same stuff applies to smartphones and tablets. I haven’t heard anything about the plugin you mentioned.
Hi… how can I connect it to the Ethernet shield? And what could I set up as an output device using the flame sensor?
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-monitoring-over-internet/
http://www.himix.lt/arduino/arduino-and-ethernet-shield-for-home-automation-control/
Are you serious?
On the Asset Store, the Leap Motion Core Assets version is 2.3.0. But you give a link to 2.3.1?
I am serious; currently there's probably an even newer version. This asset was downloaded not from the Unity Asset Store, but from Leap Motion's own website.
Sorry, man. Forgive my offense.
You are right, the Asset Store's Core Assets are old. And there is indeed a newer version on the Leap Motion website.
Hi… Thank you so much for this awesome tutorial. By the way, can a particle system be controlled by our 3D object instead of using Arduino? For example, when I click a 3D object such as a factory, the particle system – for example, smog – will be emitted… I'm trying to make an interaction with my AR project. Really hope you can help me. Thank you.
Yes, this is possible. But you will have to sort this out by yourself.
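(If it helps, here is a very small sketch of that idea – an assumption on my part, not tutorial code. Attach it to the factory model, which needs a Collider, and drag the smoke ParticleSystem into the field.)
using UnityEngine;

public class ClickToEmit : MonoBehaviour
{
    public ParticleSystem smoke;   // the smog particle system, initially stopped

    void OnMouseDown()             // fires on a mouse click or a tap on this object's collider
    {
        if (smoke.isPlaying)
            smoke.Stop();
        else
            smoke.Play();
    }
}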
Oh, I guess so too. But I don't think I can make it by myself because I'm so bad at coding etc. Anyway, thank you for replying.
Just wanna say you are awesome!! Thank you very much for all of tutorials
Thanks.
Hi, first of all thank you for all of your tutorials.
Could you use a cursor-highlighting tool in your future tutorials?
Can you suggest one, maybe one that you use?
[…] http://www.himix.lt/augmented-reality/augmented-reality-user-interface-unity3d/ […]
[…] Basic tut (1st unity #14) http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ […]
[…] http://www.himix.lt/augmented-reality/augmented-reality-user-interface-unity3d/ […]
Can I implement the app on a tablet connected to an external camera????
And how do you track each part of your body???? Can Kinect distinguish each part and give it a tag?
Thank you,
waiting for a generous reply
I don't know whether an external camera can be connected to the tablet, I haven't tried to do so.
And yes, Kinect can distinguish different parts of your body – I mean track your body parts/joints, their position and orientation.
embt..
Can you help me?
Why won't my app show my 3D model when I run it?
thanks
Hello, I would re-watch the video closely. Most likely you forgot to put your model inside the ImageTarget? Maybe some errors show up?
It still doesn't respond.
Maybe you can help me with my project?
Maybe, who knows 🙂
hola soy de peru . he vist o los tutoriales de unity3d sitio web y los scripts no son correctos .. gracias admin por la ayuda que nos brindas en tus tutoriales por que los ejemplos de unity web sites no me resultan nada
Hello, could you write it in English?
“hello i’m from peru. i visit the tutorials of unity3d on the website and the scripts is not the right… thanks admin for the help that you gave to us on your tutorials because the unity’s examples in the website not help me.”
-Something like that!, sorry for my bad english too!
So you’re saying my scripts are better than Unity3D scripts? :)))
Sir, can I ask about the details of this project? I need some help for my school project… I don't know what to buy and how much it will cost…
Everything is on the website, nothing more.
I’m amazed, I have to admit. Rarely do I encounter a blog
that’s both educative and entertaining, and without a doubt, you’ve hit the nail on the head.
The issue is something which too few men and women are speaking intelligently
about. I’m very happy that I came across this in my hunt for something regarding this.
Thanks
Hi, I have followed your tutorial and everything works fine in play mode, but I can't build the project for Android.
Worth mentioning that I configured everything like you show in the money tutorial, and that project works fine… Can you help in this matter, or is text recognition not working on Android?
Thanks in advance.
Bart
Hi, text recognition is working, but the video display is not programmed in the proper way needed for Android devices.
Is it possible to add voice recognition to this character?
I believe everything is possible. Have I done it? not yet.
Sir, thank you for helping me. Can you please tell me where the LEDs should be connected?
– regards, Meheer Shukla
It's all in the code:
#define redLed 7    // red LED on digital pin 7
#define greenLed 6  // green LED on digital pin 6
#define blueLed 5   // blue LED on digital pin 5
May I ask, what if there are two models in 1 scene? Your tutorial only has 1 model in 1 scene. Then what happens to the tag? Can I tag both of my models as Model?
Sure, why not, if you want to apply the same function.
I have tried setting both characters to the same tag name, Model, but it does not work.
What happens is that only 1 character scales up and down when I click the button; nothing happens to the other one.
I'm creating a scene that has two characters: one person performing CPR and the other person is the patient. I need both characters to scale up and down at the same time when the button is clicked.
First try it out on 2 simple cubes (tag them), try to scale them and tell me the result.
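(A minimal sketch of scaling every tagged object at once – assuming both characters are tagged "Model" and the UI button calls ScaleAll() with, say, 1.2 to grow or 0.8 to shrink. This is my own example, not the tutorial's script.)
using UnityEngine;

public class ScaleTagged : MonoBehaviour
{
    public void ScaleAll(float factor)
    {
        // FindGameObjectsWithTag only returns ACTIVE objects, so both models must be enabled in the scene
        foreach (GameObject model in GameObject.FindGameObjectsWithTag("Model"))
            model.transform.localScale *= factor;
    }
}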
I tried to make multiple objects but only 1 object can rotate, scale, etc. Please help me.
Hello, Your work is so amazing! Can you provide a copy of the source code for me?
At the moment it is not shareable.
Admin, you are my teacher and motivator… I will donate to you for sure…
Thanks for the good will.
Can you help with a tutorial or any information about User Defined Targets (Vuforia)? Thanks for the advice.
Soon it will be available.
Can I use an external webcam???
If yes, then how??
For tracking the human body? No, not in this case.
I tried your money augmented tutorial No. 28!! That was amazing!!
How can I use a user interface in an Android app?
The same way as in this tutorial.
May I know where the 3D skull and Iron Man objects come from… and can we make another object be augmented on top of the head?
https://docs.google.com/uc?authuser=0&id=0BygvzTqnzm_wTXV2UHlWN2NMXzg&export=download – extract it into the data folder, and yes, you can.
My camera does not start. The Kinect is installed properly but does not start after I press play – it's totally blank. How do I solve this problem?
I would start from the beginning. First test whether your Kinect is working properly on the PC, maybe with some other sample codes/apps?
It is cool. Great work.
How do I create a 3D object of my choice?
How to create one, or how to use already created models?
You can create models using 3ds Max, Maya, Blender, SolidWorks and lots of other 3D modelling tools.
How to use it? You should put the model in the data folder and change some Processing code (you will find out if you watch closely).
Unity collider trigger?
Nope, actually no colliders are used here.
Thanks for the tutorial. Could you explain the process of extracting the 3D monster models from the Yu-Gi-Oh game?
I haven’t extracted models from the yu-gi-oh game.
How do I reset the animation when the target is detected? It seems that the animation just pauses/continues playing when the target is lost and found again.
Yeah, for this you will need to code a little bit; just google how to stop an animation in Unity3D.
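(A hedged sketch of one way to restart it – not the tutorial's code. It assumes the model uses a legacy Animation component, that the clip is called "Take 001" (replace with your own clip name), and that the default Vuforia handler toggles the model's Renderer when the target is found/lost.)
using UnityEngine;

public class RestartAnimationWhenVisible : MonoBehaviour
{
    public string clipName = "Take 001";   // replace with your own clip name
    private Renderer rend;
    private Animation anim;
    private bool wasVisible;

    void Start()
    {
        rend = GetComponent<Renderer>();
        anim = GetComponent<Animation>();
    }

    void Update()
    {
        if (rend.enabled && !wasVisible)   // the renderer just reappeared: target was found again
        {
            anim.Rewind(clipName);         // jump the clip back to its first frame
            anim.Play(clipName);
        }
        wasVisible = rend.enabled;
    }
}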
Android Ultimate Plugin Lite seems not to be free but paid; you listed Android Ultimate Plugin Lite as a free asset for Unity3D. Thanks.
OK, is there any question hidden in what you wrote?
A big big big thanks
Hi, great tutorial, thank you, you helped me a lot in learning AR. Can you help me with one question: when I install it, it says the moneyar.apk is OK, but when I open it the screen goes black and nothing happens 🙁
OK, thanks!! Can you tell me where you got the iron chest model from?
On the wide internet, my friend, google it, can’t remember the exact website.
Hi, the last question got resolved – it was my Android version. I tested with another one and now it is working. 🙂
Another excellent tutorial. Android Ultimate Plugin now costs $5 🙁
Oh well, it’s not so much, but maybe you’ll find something for free, but you’ll need to modify some code.
Great tutorial, thanks, but I want to ask a question. I am developing a Vuforia app,
but when I take a screenshot using your code my result is only a white screen, and the augmented view has only a white background. Can you help me, please?
Sorry for my bad English, thanks.
I have a better “taking a picture” function, but I am not willing to share right now.
I cannot find the file [patt.hiro].
I want to change the marker file.
Do you know where the file is?
In the nyar4psg library folder.
Hi, I did it the same way as you did and everything worked. It's just that the UI buttons stay on screen even when the image is not tracked. The UI buttons just stay at the last place where it was tracked. What I wanted is that when the image is tracked the buttons appear where they should, and when it is not tracked they disappear. Please help. Thanks in advance.
Hello. Put your Canvas (with the buttons) inside the Image Target.
Hi, first, thank you for your work! I have the same problem: I did everything, but the UI buttons stay on the screen no matter what. Whether the Canvas is inside or outside the ImageTarget, the result is the same – the UI buttons are always on screen.
Please help. Thank you.
I have the same problem as Nqb… I have tried making the Canvas a child of the Image Target, but the Canvas is still rendered on the screen even if the image is not tracked. Can someone please help me out? Thanks in advance.
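(A possible workaround for the canvas issue above – my own sketch, not the tutorial's code. It relies on the assumption that the default Vuforia trackable handler turns the model's Renderer on and off when tracking is found/lost, so the script simply copies that state onto the Canvas every frame. Attach it to the Canvas that is a child of the ImageTarget and drag in any mesh of the augmented model.)
using UnityEngine;

public class CanvasFollowsTracking : MonoBehaviour
{
    public Renderer trackedModelRenderer;   // e.g. the model's MeshRenderer

    private Canvas canvas;

    void Awake()
    {
        canvas = GetComponent<Canvas>();
    }

    void Update()
    {
        // show the buttons only while the model itself is shown
        canvas.enabled = trackedModelRenderer.enabled;
    }
}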
Best tutorial…
Can you create a tutorial about hand interactivity with Kinect and buttons, like a virtual dressing room?
I'm not planning to.
Hi, I printed the marker but it's not working – can you please tell me the dimensions of your marker?
great videos btw
8×8 cm, thanks!
Thank you soooooo much.
How do I use this with an Arduino Uno?
The same way as with the Nano, just connect to the proper pins, which are also written in the code itself.
I have a warning!!!
WARNING: Category '' in library UIPEthernet is not valid. Setting to 'Uncategorized'
What's your full path to the library?
Thanks for these tutorials. But when I try to copy your CS file into Unity, this error appears – do you know why?
http://www.uppic.com/uploads/14531513431.jpg
I would guess that the plugin was imported incorrectly, unless you downloaded the whole project folder?
Hey, could you suggest any tutorials for using real-world marker input instead of virtual buttons? The plan is to make an application which reads the position of a real-world marker and responds based on its hovering over a real-world button which is printed on the paper page instead of virtual buttons.
I don’t have such tutorial in my list. But I would know how to code it.
Can I use a dress on the body instead of that Iron Man?
Why not?
If I place several game objects on 1 marker, can I scale/rotate/move them individually?
You can.
Can you please tell me how I should move them individually???
– Please guide!!!!!!!!
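(One simple way, sketched under my own assumptions rather than taken from the tutorials: give every child of the ImageTarget its own Collider and attach this script to each of them, then each object can be dragged on its own.)
using UnityEngine;

public class DragObject : MonoBehaviour
{
    void OnMouseDrag()   // called while the mouse/finger is held down over this object's collider
    {
        // keep the object's current distance from the camera and follow the pointer
        float depth = Camera.main.WorldToScreenPoint(transform.position).z;
        Vector3 screenPoint = new Vector3(Input.mousePosition.x, Input.mousePosition.y, depth);
        transform.position = Camera.main.ScreenToWorldPoint(screenPoint);
    }
}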
Can you please tell me where I can find the full algorithm for the code?
Have you watched the tutorial from start to end? It’s all there.
Why am I getting this error:
"An error occurred while trying to enable Vuforia play mode."
I believe it was a one-time error, no? It happens from time to time.
Hi, I am from Turkey. Please help 🙁 I need this code with an LCD. How can I write it? 🙁
So try to combine them; there is also an LCD tutorial.
Can you make a video tutorial for playing a video in place of the animation? Like, if the target is located, just play a video.
I already have the content gathered, I only need to film it. I'm planning to do so in February and I will upload it to YouTube.
I got the same problem as Nqb, and I already put the Canvas with the buttons inside the Image Target, but the result is still the same – the buttons still pop up when the marker is lost.
Can you get this to work with an Arduino Uno or Mega by any chance, and what changes would need to be made to the code?
What's that device you're using?
Leap Motion Controller
Hello, thanks for the tutorial, it's helpful 🙂
As far as I understand, the primary surface is used to track the scene, which could be the size of a dining table. What if there are 5 image targets in 5 different places (not far away) – would that extend the size?
Example: let's say I put 4 image targets at the edges of a table – would they all be tracked at the same time with the same "props" for all of them, or would each target define its own scene and props?
Hopefully you can understand what I mean 😀
This is something that you will have to test on your own, but I would say each target would have separate “props” not the same. This is my logical guess.
Why is it that when I run it on a smartphone the canvas doesn't appear?
Does everything work perfectly on PC?
Yes man, on PC I can see the button and the panel; on the phone it seems like they're invisible. When I tap randomly on the screen, hitting them, they work, but I can't see them!
Hard to tell, I would need to look into the project.
How can I send it to you? Mail?
Sorry, I won't have time to deal with it.
Hi, I downloaded your project, and when I exported it for Android I can't see the button on the smartphone, but when I download your APK it works. What's the problem?
I have retested it just now (exported the APK). It works great, actually.
How can I detect the color white?
Why are the values not the same each time, even though the colored object is the same and at the same distance?
Hi! Thanks in advance for your work! I tested it and it worked very well.
I'm looking at the code and I have some stupid questions to ask you. I would be glad of your feedback. (Sorry in advance for the dumb questions… but I'm not an expert in Arduino.)
0.1 – The raw output of the accelerometer is what? Voltages? (from readAccel)
0.2 – The raw output of the magnetometer is what?
0.3 – The raw output of the gyro is what?
1 – line 47: reading the gyro, accelerometer and magnetometer, you have a FOR loop of 201. May I ask you why?
2 – line 88: why 255? Is there a pre-set offset of 255 deg?
3 – line 89: why are you dividing the raw data by 256?
4 – lines 92–95: I don't understand what you are doing here.
5 – lines 113–115: why is it divided by 14.375?
I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn't improve the attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would improve drastically the representation of the attitude over time, enabling what is called a "smooth" representation.
Here (http://www.himix.lt/?p=915) are you using just quaternions? No KF? Right?
Have you ever tried to implement it on an Arduino Uno?
I had heard rumors that it is impossible due to limited memory?
Thanks in advance for your kind answers,
I really appreciate your wonderful job! It works nicely.
P.S. Do you have an oscilloscope for dumb macOS users?
Cheers
S
Hello,
1. Concerning all the first questions – look up some theory on the internet about how it works and read the sensor datasheets; it will answer lots of your questions.
2. "I was looking at your comment from 15-05-2015. It is interesting: the Kalman filter doesn't improve the attitude determination with quaternions. May I disagree? The Kalman filter (if well tuned) would improve drastically the representation of the attitude over time, enabling what is called a "smooth" representation." – show me some proof of that "drastic" improvement. Quaternions already give a smooth representation.
3. “Here (http://www.himix.lt/?p=915) are you using just quaternions? no KF? right?” – correct
4. “Have you ever tried to implement it on arduino uno?” – Yes I have tried it.
5. “I had rumors that is impossible due to limited memory?” – Wrong, there’s enough memory.
6. No, I don’t have oscilloscope for Mac.
Good luck with your work
Stu figlje e bucchin. thanks 🙂
Sorry?
I am ready to buy… Please, how can I get the code for the MF522, the NC door lock latch and the Arduino? Send details to my email address.
I’m not selling anything.
I did everything perfectly, but when I rotate the object it does not rotate on the Y axis; instead it makes a combination that continuously sends it down, making it impossible to orient. Why? How can I fix it?
Help me please! I've already tried some Processing files for the Kinect, like this one: https://github.com/shiffman/OpenKinect-for-Processing/blob/master/OpenKinect-Processing/examples/Kinect_v1/RGBDepthTest/RGBDepthTest.pde using the model 1414, and they worked perfectly. But for this project I installed the libraries and ran the Processing file, and it just opens and freezes on the gray window, without showing any real-time images. It doesn't seem to show any errors. What do you think it is?
I know this might be a stupid question, but I noticed when opening your project that I downloaded from this site that Unity immediately opens the "Game" tab and the "Scene" tab is missing. I was wondering how you did that. Thank you so much for this tutorial, I really learned a lot from this experience.
This was not my purpose 🙂
Hello, it's great. Can I buy some controllers?
Hi. It’s not for sale yet.
When I try to start it, the camera freezes. Any suggestion? I am using a notebook.
Without any errors?
Thank you for your work, your tutorials are very helpful!
I was wondering if it would be possible to show the panel only when the marker is tracked?
It is really possible, but I won't go into details – google the Unity/Vuforia forums.
Thanks, I've managed to find a solution on those forums! But I'm now facing another issue, as I have multiple targets. It works great until I click the camera button and track another target: the share button from the previous track still appears… Is there a way to restart/disable your script in OnTrackingLost?
I ended up duplicating your script and calling the matching canvas for each ImageTarget. I don't know if it's the best thing to do, but it is working! Sorry for the bother and thanks again for your tutorials!
Hi admin!
After tests on several devices, I am facing a few troubles on a tablet using Android 4.4.2:
– If I take a screenshot of an ImageTarget and share it right away, my app restarts.
– If I take a screenshot of ImageTarget A without sharing it afterwards and take another one of ImageTarget B right after, my app closes.
It is working great on smartphones using Android 4.2.2 and 5.1.1 though; any idea what the problem could be?
Hi, I know about the restart issue and I am not sure why the sharing function causes it.
Hi,
I downloaded the script file and loaded it directly into my scene. It didn't work. Both the buttons and the models appear as soon as I enter play mode.
Later I also tried changing the names of the buttons and models according to what I have named them in my scene. It now shows me a compiler error.
Could you please tell me which attributes need to be changed before loading the script?
Thanks.
Everything is shown in the video. You can download whole project file and test it out first.
Hi, your tutorials are very helpful, great job! I've got one question: is it possible to take a snapshot with the interface graphic elements? In my case the snapshot works, but without the augmented layer. I'm working on a simple app with OpenCV for Unity. Maybe I have to change the camera name in your script?
Thank you, and please keep your tutorials coming!
It's possible with or without, it doesn't matter. Just dive into the code; I'm hiding the UI elements there.
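(A generic sketch of the idea – hide the UI for one frame, grab the frame, save it with a unique name – not the author's actual script. Call TakeSnapshot() from a UI button and assign the canvas with the buttons in the Inspector.)
using System.Collections;
using System.IO;
using UnityEngine;

public class SnapshotWithoutUI : MonoBehaviour
{
    public Canvas uiCanvas;   // the canvas you don't want in the picture

    public void TakeSnapshot()
    {
        StartCoroutine(Capture());
    }

    private IEnumerator Capture()
    {
        uiCanvas.enabled = false;               // hide the UI for this frame
        yield return new WaitForEndOfFrame();   // wait until rendering is finished

        Texture2D shot = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        shot.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        shot.Apply();

        // time-stamped file name, so older snapshots are not overwritten
        string file = "snapshot_" + System.DateTime.Now.ToString("yyyyMMdd_HHmmss") + ".png";
        File.WriteAllBytes(Path.Combine(Application.persistentDataPath, file), shot.EncodeToPNG());

        uiCanvas.enabled = true;                // show the UI again
        Destroy(shot);
    }
}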
Hi,
I ran this code with the GY-85 BMP085 sensor. The ADXL345 library is shown in black instead of orange in the program. Also, in the serial monitor, the values do not change and they are always "0.00, 0.00, 92.50". I don't understand why. I need help 🙁
Did you sort out the problem? Maybe some wiring problem?
I actually know what your problem is. In my code, ADXL345.h is also black, but it runs just fine. I get the same problem if I move the libraries to a wrong location.
So to explain: when my code works correctly, I have my main folder labeled "Arduino". Within that, I have a unique folder for each ".ino" file, labeled the same way as the ".ino" file (minus the .ino), and a folder labeled "libraries". All of the libraries go in the "libraries" folder, each saved in another folder titled with the name of that library followed by "_library". For example, it goes:
Arduino > libraries > ADXL345_library > "all contents of that library"
I get the problem where my serial monitor values are always "0.00, 0.00, 92.48" if I move the library contents out of the "ADXL345_library" folder directly into the "libraries" folder.
I don't know if that actually makes a difference, but it was the same problem, so hopefully this helps you fix it!
Can you please tell me how to remove the axes while showing the output?
Do you still need this?
Your tutorials are great!
I learned so much from them. I watched almost all your AR tutorials and executed all the projects!
I had a lot of fun watching and learning.
Thanks a lot and keep up the good work.
Please post more tutorials 🙂
Best thanks – donation :))
Hi, No voice?
Hi, no, my throat hurts, it hurts in every single tutorial (rofl).
[…] http://www.himix.lt/augmented-reality/ (lots of AR videos here, not just Vuforia/Unity) […]
[…] http://www.himix.lt/augmented-reality/augmented-reality-using-unity3d-vuforia/ (first of several videos on Vuforia and Unity 3D) […]
good!
Hello Edgar! Great tutorials!
Have you tried to play with physics?
I’m trying to roll a ball when tilting the marker, but have no success 🙁
Maybe you can suggest something?
Thanks!
A little bit, but I just suggest you surf the web for these things with Unity physics.
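(For what it's worth, a rough sketch of the usual trick – my own assumption, not something from the tutorials. Give the ball a Rigidbody and a Collider, give the ground plane under the ImageTarget a Collider too, and let this script push the ball "downhill" whenever the marker is tilted.)
using UnityEngine;

public class TiltRoll : MonoBehaviour
{
    public Transform marker;   // the ImageTarget transform
    public float force = 2f;

    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // project world gravity onto the marker plane: that is the downhill direction
        Vector3 downhill = Vector3.ProjectOnPlane(Physics.gravity, marker.up);
        rb.AddForce(downhill.normalized * force);
    }
}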
Can we have the code, and which sensor is used?
I can sell it to you.
It’s top secret.
It's an awesome tutorial, but I was wondering if it is possible to add additional text to the user's, for example: he/she wrote "I like these games", and at the end a fixed #CompanyName?
Hello dear,
Can I know how I can display the car on that paper? Is there any device that does that, please?
Thank you
Please follow Augmented Reality Tutorial 14 and, instead of the character, import a car from the Asset Store. Devices: smartphones/tablets/PCs.
Can I know how to lock the rotation so it only rotates around the Z axis? Any tips for this?
I believe in you. I believe you will find a way by yourself.
I don't know coding very well, so I really don't know where to change the code. Can you tell me?
Does the video, when we scan, have a backsound?
What do you mean by backsound?
Thanks! I was searching for a tutorial like this!
Only half a year later, a lot has changed in Unity and Vuforia. I can't follow your video anymore. So sad.
Actually nothing has changed, except for newer versions of Unity and Vuforia. The steps to achieve the AR example stay the same.
A video with a new marker cannot be played. The video with the provided marker can be played. Do I need to upgrade to Unity Pro to play the video? Thanks in advance. 🙂
No need for a Unity3D Pro license, really; it's possible to make it with your own target, it just needs some attention. Did you follow my steps?
Can the marker be anything of my own choice? Like any image I want?
Yes, if you start following from Augmented Reality Tutorial No. 14.
But that's using Unity3D and Vuforia. So does nyar4psg have this limitation of a predefined marker file?
With nyar4psg you will only be able to track square black markers; of course, you can make your own, but nothing like images. Such marker-based tracking won't be as robust.
I did everything step by step but my videos won't play.
It shows up, but it gives me the X image, and when I click on it, it gives me the loading image forever.
How do I get my video to actually play?
ok
Do you have any idea what's wrong?
Is there a way to do multi-3D-object tracking?
I would say no, it’s too hard to track even one 3D object.
Where did you get the Iron Man skin?
Googled it.
Hello, is this project deployed as a desktop version (Windows or Mac)? If not, do you have any idea how this could be possible? Thanks.
I haven't deployed it anywhere, just tested it in Unity Play Mode. I'm not sure whether some workaround would work on this project.
Hey, I tried doing the same… but Unity crashed when I was trying to add the ImageTarget. I am using Unity 5.3.3f Personal. Can you tell me the version of Unity you are using, so I can follow your videos?
Unity 5.2.4f 32 bit, Vuforia newest version.
Thanks for your tutorials,
but I want to ask you…
is there any way or tutorial for creating 3D objects or converting 2D images into 3D?
Of course there are, but I'm not making such tutorials. Example: http://www.123dapp.com/create
Hi
I'm studying an IT major. Could I see the code for this video, please?
https://www.youtube.com/watch?v=DXLyBQTS5-w
Help me please
Hi,
What do you mean “could i see your code please”? It’s not my video.
I want to know this, could you teach me please!!
No.
Why do my video and image appear inverted? How can I fix that?
Struggling with this also; changing the orientation with a negative scale does not work.
The video in the preview is correct, but when playing it is inverted D:
Aha, fixed it: select the "Video" prefab and change Transform: Scale from +(number) to -(number); it inverts the image.
Example:
Scale X 0.1 Y 0.1 Z -0.1 <—
Doesn't work for me. I tried -0.1 and 0.1 but nothing changed.
try to invert X axis X 0.1 Y -0.1 Z 0.1 (it worked for me)
Sorry, it is
X -0.1 Y 0.1 Z 0.1
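For anyone who prefers doing this from code, here is a minimal sketch of the same negative-scale fix (my own example, assuming the script is attached to the "Video" object; which axis needs flipping can differ per setup, as the comments above show):

using UnityEngine;

// Minimal sketch: mirror the video plane by flipping one axis of its local scale,
// the same effect as typing a negative value into the Inspector.
public class FlipVideo : MonoBehaviour
{
    void Start()
    {
        Vector3 s = transform.localScale;
        // Flip X here; switch to -s.y or -s.z if your video is mirrored on a different axis.
        transform.localScale = new Vector3(-s.x, s.y, s.z);
    }
}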
My bad, I forgot to actually export it into an Android project.
Now it works.
For anyone reading the comments:
THE VIDEO WON'T PLAY IN THE EDITOR, YOU HAVE TO EXPORT IT OR IT WON'T WORK
.fbx HELP
My video isn't that long; how can I change the script so it loops?
I’ll leave it to you to solve it.
I just want to say you are a beast, really appreciate the effort you put in.
Hi, is the MPU6050 the same as the ITG3200? When I search for them separately, I see the same pictures.
The MPU is like 2 sensors in one (accelerometer and gyroscope), so I would say not the same.
I quite like reading through a post that can make men and women think.
Also, thanks for allowing me to comment!
Sir, can you change the tracker to a user-defined one?
Hi Admin,
I tried Processing on Ubuntu, but I don't know how to import the NyAR4psg library. I tried creating the folder and copying it into ~/Documents/Processing/libraries, but it doesn't work correctly.
I'm not sure about Ubuntu… Processing version 2.2.1?
Yes, I tried Processing versions 2.2.1 and 3.0.2.
Use 2.2.1. What errors do you receive?
Thank you for the tutorial! I am working on an app that is going to use Text Reco and Cloud Reco. I have a couple of questions I hope you could answer. For starters, when I run it in Unity, the area that can actually read the text is really small and not very forgiving whenever I move the text. I was wondering if you knew of a way to make the region that reads the text larger / more forgiving when the target or phone moves. Also, I was wondering if you knew anything about cloud recognition; I tried the Vuforia tutorials, but they are out of date and no longer work, and I can't figure out the newest tutorial either. I assume I'm messing something up, because when I look online nobody else seems to struggle. Any input would help, especially with cloud recognition. Thanks!
Don’t know about the text stuff, but I will make cloud recognition tutorial pretty soon.
Awesome! How soon should I be looking for the tutorial?
Can you please post the code for the MPU6050? I'm stuck with my project and need some help.
Can we use a webcam instead of Kinect?
No.
Wow…Very Nice tutorial..Thanks
Yeah, sure, you're welcome, and my PayPal account is… :))))
Hi,
PLEASE HELP ME…
I am facing issues creating virtual buttons. I've followed this video. I am linking one video below to illustrate my issues.
Link : https://www.youtube.com/watch?v=mxD0PiQ28_o
Note:
1. The virtual button is not working unless I focus on the button.
2. Without touching the button, it changes the model based on my camera movements.
3. I changed max simultaneous tracked images from 1 to 4 (each as a separate build on my Android mobile).
4. The virtual button sensitivity setting was also changed from HIGH to LOW (each as a separate build on my Android mobile).
If you want, I will also send a link to my Unity package file.
Thanks
ESWAR
Email: eswarkumar.borra@gmail.com
Hi. Try to put the virtual buttons on a textured area of the image target, not on white space.
still not working….:(
Hi,
What steps need to be performed if we want video playback on a cylinder target? I want to see a video on a cylinder-like object instead of a flat image marker.
Thank you,
Nik.
Combine the current project and this one: http://www.ourtechart.com/augmented-reality/augmented-reality-video-playback/
I have already managed to display a video on an image target.
In the image target case, we upload our marker image to the developer portal database, but for this case assume that the image marker is a sticker attached to a bottle. I want to see the video as I scan the sticker.
So, should I upload that image as a cylinder target image in the developer portal database?
And what would the hierarchy be inside the Unity project?
In case of video playback on target image:
– ImageTargetStones (Parent) contains ImageTargetBehaviour.cs
– Video (Child of ImageTargetStones) contains VideoPlaybackBehaviour.cs
What would the hierarchy be for displaying video on a cylinder?
Works well. Thanks so much
Hi !
I did everything like you did with Unity 32-bit, but when I click Start and show the target in front of my webcam, the 3D model doesn't appear in AR.
Can you help me ?
Thank you !
I would guess that you didn't check all the needed checkboxes in ARCamera or didn't select the tracker image in ImageTarget.
Thank you !!! 🙂
I did everything as per the tutorial. However, after pressing Play my PC shows a black screen instead of the webcam display.
Unity 32 bit?
No, 64-bit. Will try 32-bit, thanks.
hello,
thank you for this tutorial.
I tried to follow this video, but I have a problem.
Assets/script/SnapshotShare.cs(7,17): error CS0246: The type or namespace name `AndroidUltimatePluginController’ could not be found. Are you missing a using directive or an assembly reference?
This error appears, do you know why?
Have you installed plugin?
Of course! I can't find the class 'AndroidUltimatePluginController'.
Should I change the class name? Where?
Did you buy it?
Hi, I just bought the plugin. But somehow I faced the same problem with sh. (I am new to this.)
The same problem here.
Hey.
nice tutorials. really helped.
One doubt though: what is the basic difference between markerless and marker-based AR? I tried searching for it but I'm still confused. In this case, if we are adding the image beforehand, then how is this markerless AR?
It would really help if you could clear up my doubt.
In marker-based tracking we track only black square markers. In markerless solutions we can track image targets, faces, hands, fingers, finger-like objects, bodies, etc.
Hi, does ARToolKit support building an .exe in Unity 3D?
I haven't used ARToolKit.
Great… I need this tutorial, thanks a lot.
Can you make more tutorials on rotating the car with left and right buttons?
I already made the UDT but I can't rotate the object.
Thanks :)))
It's done here: https://www.ourtechart.com/augmented-reality/augmented-reality-user-interface-unity3d/
Thank you…
I tried to make it, but it's always looping when I press the button.
Is there any fix for this?
Uncheck the Loop box in the AudioSource Inspector.
thank you, you’re awesome :))))
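The same Loop setting can also be turned off from code; a small sketch, assuming the sound is played through a standard Unity AudioSource on the same game object:

using UnityEngine;

// Small sketch: disable looping on the attached AudioSource,
// equivalent to unchecking the "Loop" box in the Inspector.
public class DisableLoop : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        if (source != null)
        {
            source.loop = false;
        }
    }
}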
Hey, I downloaded your APK to test it, and when I click on the video it loads forever. Any idea what I'm doing wrong?
Please rebuild the apk file using provided project files. It should work.
Hello sir, this tutorial is very nice,
but I want to ask about the C# code in AppContent.
Have you updated the C# code in AppContent?
I didn’t need to. Why do you ask?
Hi, I still don't know how to install the distributed library into the program when using nyar4psg. I have googled a lot but couldn't find anything; could you show me how? Thanks.
When I click the Play button my webcam does not open. What should I do?
What error do you receive?
How do I keep the video playing when I take the camera focus off the target? Thanks in advance.
Hi Kiran,
Did you find out how to play the video continuously when the target moves out of the camera's focus? Ideally we would like to use Vuforia only to trigger the video player, so it comes out of the image, turns/moves towards the screen, and finally settles into place. Once it's in place we can touch the play button to play the video in full-screen mode. It would also be nice to close the finished video and return to targeting mode to trigger another video from a different image. Any help would be greatly appreciated. TIA.
Only the video preview in full-screen mode would not depend on the tracking state.
As for the other needs – there is no easy recipe for that, you just need to code it, but I don't think you'll be able to have additional buttons (from your side) while the video is in full screen.
Thank you, will try it.
Why can't I find the AppManager??
Hi Edgaras Art,
Thanks a lot for the tutorial. I'm having the same webcam problem, where it seems like you need the 32-bit version; the latest Unity release has no 32-bit version, what can I do?
Get it from here: https://unity3d.com/get-unity/download/archive
Hi, I followed your tutorial and it worked perfectly! Thank you so much for providing this kind of knowledge, I really appreciate it.
What I would like to ask is: how can I make the resulting .apk as light as possible? Could you give me some tips on that?
I assume you only use videos in the app, so you could stream them from the cloud.
How can I do that? (Streaming from the cloud.)
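A minimal sketch of the streaming idea, assuming a newer Unity (5.6 or later) with the built-in VideoPlayer component; the URL below is only a placeholder for your own hosted file:

using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: play a video streamed from a URL instead of bundling the
// file inside the .apk, which keeps the build small.
public class StreamedVideo : MonoBehaviour
{
    public string videoUrl = "https://example.com/my_clip.m4v"; // placeholder URL

    void Start()
    {
        VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;   // stream instead of using a local VideoClip asset
        player.url = videoUrl;
        player.playOnAwake = false;
        player.Play();
    }
}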
Hi… wonderful tutorial. However, I followed all of the steps you explained, but the camera cannot detect the 3D object, even on the textured sheet. Can you please help me with that?
Hey, I have the same issue. I followed the guide and installed the app on a Nexus 7, but the object cannot be detected. I did not make any modifications to the code, so I don't get why it is not working.
Hello, we NEED a presentation like this https://www.youtube.com/watch?v=Kb99oD7FVgA — what would it cost?
What exactly do you need?
We need that same application with the same animals
Hi, I followed your tutorial, and I couldn't find AppManager.cs and SceneViewManager.cs in the Unity /Assets folder.
Could you tell me how and where I can find them?
Nice tutorial, but…
Can I request an AR FPS tutorial??
What’s FPS?
I know this is late, but I think he means something like in the show, where the monsters are huge and standing right in front of the player, that type of thing.
(Originally in French) I hope this project will be possible on smartphones and that you create all the Yu-Gi-Oh! monsters, because the app (androdisc) only has 60 monsters. I would also like to know how I can build this demo. Thanks.
Can you write it in English? Thanks.
Hey, please can you tell me how I can put my own video in place of the video in the AppContent folder? Or how do you convert the video to its meta files?
The same way I did in the video, you don’t need to convert to meta files.
So basically you mean I just have to put my video in the AppContent folder instead of your augmented_reality_technology file?
By doing this, will my video play instead of the video provided by you?
Pretty much; don't forget to add the *.m4v file extension.
I tried it, but when I play it on mobile, the moment I click on the screen to play, the screen goes black… Is there a specific mistake I am making? Can you please tell me?
I have tried your tutorial, but when I move the object away from the camera, the interface stays on the screen, but at an angle. How did you fix this? Is it something to do with the script? Please let me know, thanks.
Same here! Would love to know how to fix it, so it only pops up when you point towards the target. I already tried putting the Canvas inside the Image Target, and it does not work. Thanks for the tutorials!
Thank you for your tutorial, it's very helpful.
I followed your tutorial, but I have an error like this:
Error attempting to SetHeadsetPresent
UnityEngine.Debug:LogError(Object)
Vuforia.VuforiaAbstractBehaviour:SetHeadsetPresent(String)
Vuforia.VuforiaAbstractBehaviour:Start()
How do I fix it?
Hello, I downloaded the source code and tried running it on my Android device. However, only the 2D ground image is displayed when targeting the image target. Any clue why that might be happening? Please help as soon as possible. Thanks.
Hi Edgaras Art,
First of all, thanks for the tutorial series. I have a question: in this case, although we move the tracking image with our hands, it remains stationary in the Unity scene, and it seems that the AR Camera is the thing moving.
What I want is that, as I move the tracking image with my hands, the 3D object placed on the image moves along with it in 3D space.
Please help me figure it out.
I really don't get what you want to do, as it already does what you just described.
Can I have the code and the name of the sensor too?
I have an MPU-9255 sensor.
Where can I get this application?
Hey man, I've tried this tutorial and it works, but now I've got a problem: the warning says "trackable userdefine lost" and the object doesn't show up when I click the button.
Can you tell me how to fix this?
Thanks
Hello Edgaras,
thank you very much for your tutorials.
I tried this with my own video and it works perfectly.
I also changed the orientation of the video by Selecting VIDEO in Hierarchy and changing the X Scale value from 0.1 to -0.1
I have a problem when I pause the video and play it again: the music starts from the beginning, but the video stays stuck.
Where is the problem? Maybe because I stream an MP4 video instead of M4V?
Thank you very much
Marco
I am Getting this error
The type or namespace name `MovieTexture’ could not be found. Are you missing a using directive or an assembly reference?
Same here. I'm basically new, so I don't understand how the script works. When I try playing it works, but there's that error and I can't build it.
cool
Are you using a standalone app for Windows? If so, can you tell me how you did it?
Thanks.
No, I'm not.
How do I export the application so it can be displayed on a big screen?
Hey man, I left a comment on the old website too.
I want to know if this is going to work on an Uno?
Uno?
I meant the Arduino Uno. I can see that you used an Arduino Nano.
It will work the same way.
Awesome! How can I put in a request for that code?
You can make a lot of magic for games. We can study on your website, hehe.
Study my friend, study.. And when one of your apps will make millions, don't forget to drop a few bucks on my name 😉
I am Chinese, in China; I use a VPN to browse your site O(∩_∩)O Thanks
No problem.
Hey man, should I use the old website for requests?
The next step should be firing bullets.
Haha Kill zombies
cool
Nice 🙂
Hello, how do I compile the application so that it can be seen on a big screen?
I wonder how to move this kind of app to a mobile phone and create a control system on it? Thank you!
In a similar way I did it here with buttons:
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-user-interface-unity3d/
and here with exporting the app:
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-android-app-export/
How about jump control? This is just for movement control. I'm so confused.
Sign me up! We are in awe with this AMAZING project! We are ready to get our hands & minds on it!
First day of Summer Break, notification popped up for newest demo…my boys dropped EVERYTHING to hover & watch!
Love it!!
What is this for?
Please, tutorial part 3!!! You are awesome, bro!!
Does this work with android?
Yes.
Hello
I am a fan of your page.
In Tutorial No. 39 you put some JPG images as an example.
How do I place OTHER JPG images in Unity?
I tried to put some, but it did not accept them.
Thankful
Great tutorial, but how can I test it on an iPhone?
Hi, I think you could just build the .apk file in Unity, import it to your device, and install it manually. It should work.
cool i like
When I import the videoplayback package, I get these errors:
Assets/Common/MenuOptions.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/AsyncSceneLoader.cs(7,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
Assets/Common/SplashAbout/LoadingScreen.cs(10,19): error CS0234: The type or namespace name `UI’ does not exist in the namespace `UnityEngine’. Are you missing an assembly reference?
How do I solve it?
Is the score between the two avatars real or fictional? If it is real, how is it done?
Real.
https://www.ourtechart.com/augmented-reality/tutorial/augmented-reality-eye-tracking/ I want to learn this tutorial, thanks.
Can I use a simple webcam or Kinect cameras??
Very good job !
WYSIWYG: what you see in this tutorial is what you get in the download.
Thanks
I'm building the Unity 3D project for my Samsung and the panel and buttons are not appearing. They appear when I test it in Unity 3D, but not on my phone.
Do you have any idea what it could be?
(Originally in Spanish) You're great, your content is worthy of a master's class. This game is really good and has a lot of potential in many ways, but I think you're missing digital marketing; if you need help with that, I know a little, hehe. I hope you keep making this kind of content and that your projects are a success.
Greetings from Colombia.
How do I make a model?? Thanks
Could you show me how to make it? It's so great.
How can I do that for a static 3D character, with no animation??
If the 3D model is rigged, then you can do that with a static model (without animation). The head moves together with the head bone. For the whole model (the parent gameobject), just use one of the rotation methods (RotateAround, eulerAngles) for left/right rotation. 🙂
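A rough sketch of that idea (the field names and input axis are my own assumptions, not from the tutorial): rotate the parent object with RotateAround and aim the head bone separately.

using UnityEngine;

// Rough sketch: rotate the whole (static) model left/right around its own up axis,
// and aim the head bone at a target. "headBone" and "lookTarget" are placeholder
// references you would assign in the Inspector.
public class StaticModelRotation : MonoBehaviour
{
    public Transform headBone;    // the rig's head bone
    public Transform lookTarget;  // what the head should face
    public float turnSpeed = 60f; // degrees per second

    void Update()
    {
        float input = Input.GetAxis("Horizontal"); // left/right arrow keys
        // Rotate the parent game object around its own position and up axis.
        transform.RotateAround(transform.position, Vector3.up, input * turnSpeed * Time.deltaTime);
    }

    void LateUpdate()
    {
        // Applied after animation (if any), so the head bone keeps facing the target.
        if (headBone != null && lookTarget != null)
        {
            headBone.LookAt(lookTarget);
        }
    }
}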
Please, I'm a student starting my GP and I need help. I want to know how to start with an augmented reality app for Android devices; I want to use Android Studio, step by step. My idea involves face tracking too.
Can you please help me?
I have the same problem as:
"I did everything step by step but my videos won't play. It shows up but it gives me the X image and when I click on it it gives me the loading image forever." Also, I can't find the file "AppManager.cs". Any ideas? I use the latest Unity and Vuforia plugins.
I noticed there were questions about the videos being inverted during tests. Mine is doing the same. I have tried all the suggestions. Can anyone help with where the proper axis change should be made?
Current setting for ImageTarget is: X -0.1 Y 0.1 Z 0.1
Thanks in advance.
tutorial please.
good idea! learn from U! thx!
Hi, thanks for your tutorials. These tutorials are a great help for beginners.
I'm facing a small problem, please guide me through it: when I press the arrow keys the player animates perfectly and also rotates, but it doesn't physically move on the plane; it only animates in a fixed spot. Thanks in advance 🙂
Your website is awesome. I discovered it like several months ago, but always thought that this requirement of having a target image is somewhat cumbersome. Thank you very much, sir!
I've tried to make video playback like this, but Unity says that "ISampleAppUIEventHandler" cannot be found. That's because I don't have that file in my project, so where can I get it???
hi
In your opinion, which one is better for good quality in an AR project?
KUDAN or VUFORIA.
Both 🙂
Vuforia can achieve it?
No, Vuforia at the moment doesn't have SLAM.
Thank you very much for your sharing!
Hello, I'm from Manaus, Brazil.
First, congratulations on your tutorials, 10 out of 10. I've been following this, but now both my Unity 32-bit and 64-bit give an error when starting the camera. I already put the API key in the editor and it still gives an error with this name:
DllNotFoundException: KudanPlugin
Kudan.AR.TrackerWindows.StopInput() (at Assets/KudanAR/Scripts/Classes/TrackerWindows.cs:96)
Kudan.AR.KudanTracker.OnDestroy() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:438)
If possible, send help by email, I would be very grateful.
Hugs
Switch to Android or iOS.
Excuse me, I don't know why my bullets fly all over the place when I change the AR camera (using the card to make the virtual object appear) and the image target.
Excuse me, can I ask where "virtual button event handler" is? It's not in the Vuforia scripts.
Wow, so cool.
I only know the shooting scenario in Unity, but I really want to know how to make this game.
I followed your tutorial, it's excellent. But I am not able to control the animation; in Game view it's very large. Can you please explain how to control the animation?
Keyboard keys left/right/forward/backward.
Where can we get the “Change Motion” Button? That’s a whole different story isn’t it?
Sir, I have an issue, please help me: my target image is not showing in Unity3D, it just shows as white in Unity.
Hello,
Thanks a lot for such useful and detail instructions! I’m just starting exploring how to create AR with Vuforia and Unity. And these tutorials definitely come in handy 🙂
I tried to follow this tutorial, but unfortunately there is no such property for a button (like in your video at 7:11). Here's a screenshot of what I see: http://prntscr.com/c5cqi4 . There is no init() function. I tried to use start() instead, but it didn't generate that 2nd script where you change some code (from a private function to public).
I’m using Unity 5.4.0 and Vuforia 6 (tested on v5 as well).
Can you please explain me what I’m doing wrong and how to fix it? Thank you so much in advance! Hope you’ll find time to answer.
Keep up doing awesome things! 😉
Best regards,
Alex
Hello, I like this video. How do I make it, how can I study this? I come from China.
How do I run it on 64-bit Unity?
Vuforia works only on 32-bit Unity.
thanks!
This video is so amazing!
I want to know how to do that. Can you send me source code or tutorial of this project.
Thank you!
https://drive.google.com/open?id=0BygvzTqnzm_wblduNnVSdmllZ00
I received it. Thank you so much!
Hello! Can you please explain how I can put this code into my 3D objects to make them transform?
Hello guys,
great work.
Excuse me, when will you upload the tutorial? It really interests me, and I want to learn more about AR.
Regards,
Fredy
It really helped me! You have such great material, thanks!
can you help me
Failed to load ‘Assets/KudanAR/Plugins/x86_64/KudanPlugin.dll’ with error ‘操作成功完成。
‘, GetDllDirectory returned ”. If GetDllDirectory returned non empty path, check that you’re using SetDirectoryDll correctly.
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:203)
Kudan.AR.KudanTracker:GetPlugin() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:196)
Kudan.AR.KudanTracker:Start() (at Assets/KudanAR/Scripts/Components/KudanTracker.cs:220)
Unity 64-bit? Did you switch to the Android platform?
Yes, I switched to Android.
Did anyone manage to resolve the above problem?
Why no code? Can you share it with us?
Hi, may I know how the characters look at each other? Are you using LookAt in Unity or something else? I want my characters to look at each other, but I still haven't found out how.
Correct, LookAt.
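A minimal sketch of the LookAt approach, assuming each character holds a reference to the other one's Transform (assigned in the Inspector):

using UnityEngine;

// Minimal sketch: make this character face the other one every frame,
// rotating only around the Y axis so the model doesn't tilt up or down.
public class FaceEachOther : MonoBehaviour
{
    public Transform otherCharacter; // placeholder reference to the other character

    void Update()
    {
        if (otherCharacter == null) return;
        Vector3 target = otherCharacter.position;
        target.y = transform.position.y; // keep the look direction horizontal
        transform.LookAt(target);
    }
}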
Hi There, I am playing around with this and am wanting to have 5 pages instead of 3. For some reason when I add two more pages, the swipeimage script seems to malfunction, not allowing me to swipe at all. Any thoughts? I adjusted all the parameters I could think of to account for the new pages but I didn’t mess with the script at all. Would it need modification? It didn’t seem like it should…
Hey, thanks for the tutorial… but the share button does nothing while all the other buttons work. I bought the plugin and followed the tutorial; is there a permission I should be adding, or has something changed?
I have followed the instructions above. However, the plane does not automatically disappear unless I click it. After I click to make the plane disappear, the cube or sphere does not appear. Please give me some advice.
I am using Unity 5.2 and Vuforia SDK 5.5.9.
Could you tell me the computational cost of this application, considering it running on an Android smartphone?
Hi Edgaras, I'm very interested in your AR technique in the video; if possible, could you make a tutorial or share some information that I can look up about this?
Also, in Unity, could you tell me how you map the 2D coloring texture onto the 3D model? Please!
Thank you!
Hello, I need some help with virtual buttons.
Can you demonstrate how to make virtual buttons rotate and move objects?
Hi!!
Does this plugin work with iOS??
No. I already tried. It doesn’t work on iOS.
you are awesome, nice tutorial.
When I scan a plane using the camera and loading bar, the model gets loaded on another plane. Is there anything I might have missed or messed up? I followed your tutorial closely!
Can you help me make a Cloud Recognition project? You can even get money from me.
Hi Edgaras,
I follow your tutorials and they are great.
I have a problem.
I am using unity and vuforia (user defined target).
I am recognizing objects as targets (followed this tutorial), but my virtual 3D object and canvas are unstable, and the extended tracking doesn’t work like in image targets.
Do you have some experience with this? Has this happened to you at some point?
I will explain: I have a sculpture to recognize, and I tried the AR-Media object scanning solution, but the app becomes slow and also unstable; that is why I am using user defined targets to overcome my problems with object recognition.
No audio in the Android player? Do I need to do something specific?
There should be audio; nothing additional is necessary.
Why does my AppManager.cs give an error?
#region PUBLIC_MEMBER_VARIABLES
public string TitleForAboutPage = "About";
public ISampleAppUIEventHandler m_UIEventHandler; // error: The type or namespace name 'ISampleAppUIEventHandler' could not be found
#endregion PUBLIC_MEMBER_VARIABLES

#region PROTECTED_MEMBER_VARIABLES
public static ViewType mActiveViewType;
public enum ViewType { SPLASHVIEW, ABOUTVIEW, UIVIEW, ARCAMERAVIEW };
#endregion PROTECTED_MEMBER_VARIABLES

#region PRIVATE_MEMBER_VARIABLES
private SplashScreenView mSplashView;
private AboutScreenView mAboutView;
private float mSecondsVisible = 4.0f;
#endregion PRIVATE_MEMBER_VARIABLES

// This gets called from SceneManager's Start()
public virtual void InitManager()
{
    mSplashView = new SplashScreenView();
    mAboutView = new AboutScreenView();
    mAboutView.SetTitle(TitleForAboutPage);
    mAboutView.OnStartButtonTapped += OnAboutStartButtonTapped;
    m_UIEventHandler.CloseView += OnTappedOnCloseButton;
    m_UIEventHandler.GoToAboutPage += OnTappedOnGoToAboutPage;
    InputController.SingleTapped += OnSingleTapped;
    InputController.DoubleTapped += OnDoubleTapped;
    InputController.BackButtonTapped += OnBackButtonTapped;
    mSplashView.LoadView();
    StartCoroutine(LoadAboutPageForFirstTime());
    mActiveViewType = ViewType.SPLASHVIEW;
}

public virtual void DeInitManager()
{
    // mSplashView.UnLoadView();
    // mAboutView.UnLoadView();
    // m_UIEventHandler.CloseView -= OnAboutStartButtonTapped;
    // m_UIEventHandler.GoToAboutPage -= OnTappedOnGoToAboutPage;
    InputController.SingleTapped -= OnSingleTapped;
    InputController.DoubleTapped -= OnDoubleTapped;
    InputController.BackButtonTapped -= OnBackButtonTapped;
    m_UIEventHandler.UnBind();
}

public virtual void UpdateManager()
{
    // Does nothing, but anyone extending AppManager can run their update calls here
}

public virtual void Draw()
{
    m_UIEventHandler.UpdateView(false);
    switch (mActiveViewType)
    {
        case ViewType.SPLASHVIEW:
            // mSplashView.UpdateUI(true);
            break;
        case ViewType.ABOUTVIEW:
            mAboutView.UpdateUI(true);
            break;
        case ViewType.UIVIEW:
            m_UIEventHandler.UpdateView(true);
            break;
        case ViewType.ARCAMERAVIEW:
            break;
    }
}

#region UNITY_MONOBEHAVIOUR_METHODS
#endregion UNITY_MONOBEHAVIOUR_METHODS

#region PRIVATE_METHODS
private void OnSingleTapped()
{
    if (mActiveViewType == ViewType.ARCAMERAVIEW)
    {
        // trigger focus once
        m_UIEventHandler.TriggerAutoFocus();
    }
}

private void OnDoubleTapped()
{
    if (mActiveViewType == ViewType.ARCAMERAVIEW)
    {
        mActiveViewType = ViewType.UIVIEW;
    }
}

private void OnTappedOnGoToAboutPage()
{
    mActiveViewType = ViewType.ABOUTVIEW;
}

private void OnBackButtonTapped()
{
    if (mActiveViewType == ViewType.ABOUTVIEW)
    {
        Application.Quit();
    }
    else if (mActiveViewType == ViewType.UIVIEW) // Hide UIMenu and show ARCameraView
    {
        mActiveViewType = ViewType.ARCAMERAVIEW;
    }
    else if (mActiveViewType == ViewType.ARCAMERAVIEW) // if it's in ARCameraView
    {
        mActiveViewType = ViewType.ABOUTVIEW;
    }
}

private void OnTappedOnCloseButton()
{
    mActiveViewType = ViewType.ARCAMERAVIEW;
}

private void OnAboutStartButtonTapped()
{
    mActiveViewType = ViewType.ARCAMERAVIEW;
}

private IEnumerator LoadAboutPageForFirstTime()
{
    yield return new WaitForSeconds(mSecondsVisible);
    mSplashView.UnLoadView();
    mAboutView.LoadView();
    mActiveViewType = ViewType.ABOUTVIEW;
    m_UIEventHandler.Bind();
    yield return null;
}
#endregion PRIVATE_METHODS
someone help me
Very nice. Where can I find / download the target image?
I am running NyAR4psg/3.0.5;NyARToolkit/5.0.9 in processing 2.2.1 with a Microsoft LifeCam HD-5000 on windows 7. When I run simpleLite, the background (camera) image appears only in the upper right corner of the window. It shows the lower left of the camera view. If the background image was correct the tracking appears to be correct. I looked in the reference material and found public void drawBackground (processing.core.PImage i_img)
This function draws the PImage to the background. PImage draws in part of farclip surface +1.
This function is equivalent to the following code.
:
PMatrix3D Om = New PMatrix3D (((PGrapPGraphicsOpenGLhics3D) G) .Projection);
SetBackgroundOrtho (Img.Width, Img.Height)
pushMatrix ();
ResetMatrix ();
Translate (0, 0, – (Far * 0.99F));
Image (img, -Width / 2, -Height / 2);
popMatrix ();
SetPerspective (Om);
:
My approach was to sub this code in for the line “nya.drawBackground(cam);” then mess with the translate to correct the issue. But I get a “syntax error, maybe a missing semicolon?” – I added a semi-colon to the end of the second line SetBackgroundOrtho (Img.Width, Img.Height); and It still hangs on the first line with the same error.
Any help would be appreciated.
Can you use the Kinect's RGB feed as the input video for marker-based AR?
Hi,
Please help me.
I downloaded the Augmented Reality Vespa User Interface – Mimic No. 1. Really, this is only the interface, so I can't test the project.
Where can I download the motor image?
The page has been updated with a tracker image.
Hi, I already sent an inquiry for the demo, hope to hear from you soon.
thanks
Hello, why haven't there been any video updates recently?
Hi, I need to know if I have to buy a 3D sensor camera to build a game with Smart Terrain, or can I use the normal camera of my smartphone? Thank you.
You can use the normal camera.
Hello, interactive tutorial series can be a video presentation, charges can only look at some places don’t know much about the project
Hello Team,
Thank you for providing this nice platform. We are looking for a really good developer who can develop this paint functionality for us. We are already working on our product and need to integrate that part into it (we are using Unity3D, Vuforia, C#).
The basic requirement: the app should recognize/read the colors from the marker and apply them to the model itself.
Looking forward to hearing from you soon.
Regards
ABID
P.S. I’ll be submitting few cool AR demos to this site, very soon 🙂
I just wanna say you are great. I have no words to thank you. You are amazing. You rock. You are the best.
Hi…
good job
Please teach us ARToolKit.
Hi, thanks for the tutorial!
I'm having an issue with the screenshot aspect ratio. When I take a screenshot (in landscape or portrait mode), the image comes out vertically stretched (or horizontally squished). I tested it on 3 Android devices, same on all 3. The images come out normal when I take a screenshot in Unity on my Mac.
After a lot of research, I still can’t figure out the cause.
Do you have any suggestions?
Thank you
Hi hello good day, 🙂
Thank you so much for the tutorial! Really appreciated it. 🙂
But anyway, do you have any idea how to reset the distance value once it hits "OnTrackingLost"?
Because every time, I need to separate it first (while scanning the object) so that the particle effects get destroyed.
Otherwise, they will still remain on top of the Image Target when I scan a second time, even though I didn't connect the paper.
It would be greatly appreciated if anyone could help with this problem. Thank you! :)
When I scan only one part of the image alone the second time, the particles still stick to the image even though I didn't pair it up with the other image target.
I've tried a few solutions, but too many errors come out.
For one of them, I tried putting this inside the OnTrackingLost() section,
"
string NameTarget = "imageTarget_" + mTrackableBehaviour.TrackableName;
GameObject target = GameObject.Find(NameTarget); // note: 'target' is looked up but never used afterwards
transform.position = new Vector3(0, 0, 0);
"
in order to reposition the sphere back to its normal position when tracking is lost, but it doesn't seem to work because I'm not very good at C# coding.
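A rough sketch of one way to do that reset, following the old Vuforia ITrackableEventHandler pattern (the "effect" reference and the reset position are assumptions; this is not the code from the thread):

using UnityEngine;
using Vuforia;

// Rough sketch: remember the effect's starting local position and put it back
// whenever the image target is lost. Attach to the ImageTarget game object;
// "effect" is a placeholder reference to the particle object.
public class ResetOnTrackingLost : MonoBehaviour, ITrackableEventHandler
{
    public Transform effect;
    private Vector3 startLocalPosition;
    private TrackableBehaviour mTrackableBehaviour;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
        if (effect != null)
            startLocalPosition = effect.localPosition;
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool lost = newStatus == TrackableBehaviour.Status.NOT_FOUND ||
                    newStatus == TrackableBehaviour.Status.UNKNOWN;
        if (lost && effect != null)
        {
            effect.localPosition = startLocalPosition; // put the effect back when tracking is lost
        }
    }
}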
Hi
Can I use this solution in Android apps?
Do MS Kinect drivers work on Android with the phone camera?
Did you test it?
Thanks a lot for the answer.
Hey guys, I have used your tutorial to make the simple video playback app and it's working great. I just want to know how we can change the size of the video that appears after tracking the image target?
Please i need help
Where can I buy the full version?
Can you help me create an app with AR?
How do I increase the scaling of the video playback in Unity? I want the scale to be doubled; how can I do this?
Did you use the Particle WiFi module with an antenna??
It's without an antenna.
Hi,
Thanks for the tutorial, you make really easy-to-follow tutorials.
One question: with the current script, none of the UI elements are visible in the screenshot.
Is it possible to get a logo/signature in the top left of the screen, so that it also gets displayed in the screenshot?
I tried but had no success.
Why does the Kudan package not work on 32-bit Unity?
I don't know; probably low support, lack of human resources…
Could I get the breadboard circuit diagram??
Hello, I have followed the whole tutorial properly, but when I connect my laptop to the Kinect, the picture won't open. I don't know what happened; do you know how to solve this problem?
Hello! Great tutorial, but when I copy SnapShot.cs I get:
CS0246 C# The type or namespace name “AndroidUltimatePluginController” could not be found (are you missing a using directive or an assembly reference?)
What to do now?
Is it possible to make an augmented reality based Android application in Processing 3? If it is possible, then how?
I suggest you drop Processing 3 and use Unity3D. Don't waste time on Processing.
And one more thing: can we use our own marker?
Can you please let me know if you have any apps for sale?
Hi, I'm interested in how to do that. Could you make a tutorial about it?
How do I export the AR app from Unity? I always get the error message: Unable to locate Android SDK.
It's because you need to install the SDKs from the Android website.
Downloading Android Studio will help you; then you need to set the SDK path inside Unity.
That's it.
I tried to run your project in Unity 3D, it's amazing! Thank you. But when I built it as an application, it can't detect my webcam. Do you know why? Please give me an answer, thank you.
context.enableUser(); — when running this sketch, at this line the error shows: "The function context.enableUser() expects parameters like context.enableUser(int)".
Please help me fix it.
Could you explain a little bit about the math for target tracking?
Many thanks.
Could you explain the math, please?
Great tutorial! How can I download the cube multi-target? I need to print the QC paper to create the cube so I can try this tutorial!
Thanks.
I've never succeeded in creating AR files using Vuforia and Unity. I use a desktop PC that doesn't have any camera; can I do it with this specification: desktop PC, Win 10, 16 GB RAM?
Nice 🙂
Assets/VirtualButtonEventHandler.cs(5,14): error CS0101: The namespace `global::’ already contains a definition for `VirtualButtonEventHandler’. What about this error?
[…] https://www.ourtechart.com/augmented-reality/tutorial/kudan-slam-technique-pokemon-go/ […]
And can you give a tutorial that requires writing C# code in order to control the virtual image??
I just think you're great.
Hi sir,
your tutorials are great. Thanks for uploading…
Can we integrate 2 or more videos with a single image target and make next and previous buttons to change between the videos?
Is it possible?
thanks in advance
Hi there,
Thank you for the video.
I am new to Unity and all this stuff.
I followed each step,
but I had an error after I removed the Utility folder, at minute 10:34.
This is the error:
Assets/ZigFu/Scripts/Viewers/ZigDepthmapToParticles.cs(19,13): warning CS0618: `UnityEngine.ParticleEmitter’ is obsolete: `This component is part of the