linux: LIRC_GET_REC_RESOLUTION should return microseconds (from media_tree)

Author: MilhouseVH
Date:   2017-07-29 21:24:08 +01:00
Parent: 8e96d1ddce
Commit: 7815dacf22


@@ -0,0 +1,33 @@
From 29b67f9aaa06adae4db23b775fa92b0e376a36a3 Mon Sep 17 00:00:00 2001
From: Sean Young <sean@mess.org>
Date: Fri, 7 Jul 2017 18:49:18 -0300
Subject: media: lirc: LIRC_GET_REC_RESOLUTION should return microseconds

Since commit e8f4818895b3 ("[media] lirc: advertise
LIRC_CAN_GET_REC_RESOLUTION and improve") lircd uses the ioctl
LIRC_GET_REC_RESOLUTION to determine the shortest pulse or space that
the hardware can detect. This breaks decoding in lirc because lircd
expects the answer in microseconds, but the value is returned in
nanoseconds.

Cc: <stable@vger.kernel.org> # v2.6.36+
Reported-by: Derek <user.vdr@gmail.com>
Tested-by: Derek <user.vdr@gmail.com>
Signed-off-by: Sean Young <sean@mess.org>
Signed-off-by: Mauro Carvalho Chehab <mchehab@s-opensource.com>
---
diff --git a/drivers/media/rc/ir-lirc-codec.c b/drivers/media/rc/ir-lirc-codec.c
index a30af91..d2223c0 100644
--- a/drivers/media/rc/ir-lirc-codec.c
+++ b/drivers/media/rc/ir-lirc-codec.c
@@ -266,7 +266,7 @@ static long ir_lirc_ioctl(struct file *filep, unsigned int cmd,
 		if (!dev->rx_resolution)
 			return -ENOTTY;
 
-		val = dev->rx_resolution;
+		val = dev->rx_resolution / 1000;
 		break;
 
 	case LIRC_SET_WIDEBAND_RECEIVER:
--
cgit v0.10.2
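
For context, here is how the fix plays out from userspace. Below is a
minimal sketch, not part of the patch, of how a consumer such as lircd
might query the receiver resolution. LIRC_GET_FEATURES,
LIRC_CAN_GET_REC_RESOLUTION and LIRC_GET_REC_RESOLUTION are real lirc
uapi symbols from <linux/lirc.h>; the /dev/lirc0 device path and the
error handling are assumptions. With the fix applied, the reported value
is in microseconds, as lircd expects; before it, the same call handed
back nanoseconds, a value 1000x too large, which broke decoding.

/*
 * Minimal userspace sketch (not part of the patch): query the shortest
 * pulse/space the receiver can detect, the way lircd does.
 */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/ioctl.h>
#include <unistd.h>

#include <linux/lirc.h>

int main(void)
{
	unsigned int features = 0;
	unsigned int resolution = 0;
	int fd;

	fd = open("/dev/lirc0", O_RDONLY);	/* device node is an assumption */
	if (fd < 0) {
		perror("open /dev/lirc0");
		return EXIT_FAILURE;
	}

	if (ioctl(fd, LIRC_GET_FEATURES, &features) < 0) {
		perror("LIRC_GET_FEATURES");
		close(fd);
		return EXIT_FAILURE;
	}

	if ((features & LIRC_CAN_GET_REC_RESOLUTION) &&
	    ioctl(fd, LIRC_GET_REC_RESOLUTION, &resolution) == 0) {
		/*
		 * lircd interprets this as the shortest pulse or space, in
		 * microseconds, that the hardware can detect. Before the fix
		 * above, the kernel handed back nanoseconds here, which broke
		 * lirc's decoding.
		 */
		printf("receiver resolution: %u us\n", resolution);
	}

	close(fd);
	return EXIT_SUCCESS;
}

The kernel-side fix is deliberately small: dev->rx_resolution stays in
nanoseconds internally, and the ioctl handler converts to microseconds at
the uapi boundary with the "/ 1000" division shown in the diff above.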